Oct  4 00:28:24 np0005470441 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  4 00:28:24 np0005470441 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  4 00:28:24 np0005470441 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  4 00:28:24 np0005470441 kernel: BIOS-provided physical RAM map:
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  4 00:28:24 np0005470441 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  4 00:28:24 np0005470441 kernel: NX (Execute Disable) protection: active
Oct  4 00:28:24 np0005470441 kernel: APIC: Static calls initialized
Oct  4 00:28:24 np0005470441 kernel: SMBIOS 2.8 present.
Oct  4 00:28:24 np0005470441 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  4 00:28:24 np0005470441 kernel: Hypervisor detected: KVM
Oct  4 00:28:24 np0005470441 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  4 00:28:24 np0005470441 kernel: kvm-clock: using sched offset of 4818369460 cycles
Oct  4 00:28:24 np0005470441 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  4 00:28:24 np0005470441 kernel: tsc: Detected 2800.000 MHz processor
Oct  4 00:28:24 np0005470441 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  4 00:28:24 np0005470441 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  4 00:28:24 np0005470441 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  4 00:28:24 np0005470441 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  4 00:28:24 np0005470441 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  4 00:28:24 np0005470441 kernel: Using GB pages for direct mapping
Oct  4 00:28:24 np0005470441 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  4 00:28:24 np0005470441 kernel: ACPI: Early table checksum verification disabled
Oct  4 00:28:24 np0005470441 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  4 00:28:24 np0005470441 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  4 00:28:24 np0005470441 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  4 00:28:24 np0005470441 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  4 00:28:24 np0005470441 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  4 00:28:24 np0005470441 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  4 00:28:24 np0005470441 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  4 00:28:24 np0005470441 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  4 00:28:24 np0005470441 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  4 00:28:24 np0005470441 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  4 00:28:24 np0005470441 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  4 00:28:24 np0005470441 kernel: No NUMA configuration found
Oct  4 00:28:24 np0005470441 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  4 00:28:24 np0005470441 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Oct  4 00:28:24 np0005470441 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  4 00:28:24 np0005470441 kernel: Zone ranges:
Oct  4 00:28:24 np0005470441 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  4 00:28:24 np0005470441 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  4 00:28:24 np0005470441 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  4 00:28:24 np0005470441 kernel:  Device   empty
Oct  4 00:28:24 np0005470441 kernel: Movable zone start for each node
Oct  4 00:28:24 np0005470441 kernel: Early memory node ranges
Oct  4 00:28:24 np0005470441 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  4 00:28:24 np0005470441 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  4 00:28:24 np0005470441 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  4 00:28:24 np0005470441 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  4 00:28:24 np0005470441 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  4 00:28:24 np0005470441 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  4 00:28:24 np0005470441 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  4 00:28:24 np0005470441 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  4 00:28:24 np0005470441 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  4 00:28:24 np0005470441 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  4 00:28:24 np0005470441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  4 00:28:24 np0005470441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  4 00:28:24 np0005470441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  4 00:28:24 np0005470441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  4 00:28:24 np0005470441 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  4 00:28:24 np0005470441 kernel: TSC deadline timer available
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Max. logical packages:   8
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Max. logical dies:       8
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Max. dies per package:   1
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Max. threads per core:   1
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Num. cores per package:     1
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Num. threads per package:   1
Oct  4 00:28:24 np0005470441 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  4 00:28:24 np0005470441 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  4 00:28:24 np0005470441 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  4 00:28:24 np0005470441 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  4 00:28:24 np0005470441 kernel: Booting paravirtualized kernel on KVM
Oct  4 00:28:24 np0005470441 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  4 00:28:24 np0005470441 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  4 00:28:24 np0005470441 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  4 00:28:24 np0005470441 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  4 00:28:24 np0005470441 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  4 00:28:24 np0005470441 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  4 00:28:24 np0005470441 kernel: random: crng init done
Oct  4 00:28:24 np0005470441 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: Fallback order for Node 0: 0 
Oct  4 00:28:24 np0005470441 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  4 00:28:24 np0005470441 kernel: Policy zone: Normal
Oct  4 00:28:24 np0005470441 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  4 00:28:24 np0005470441 kernel: software IO TLB: area num 8.
Oct  4 00:28:24 np0005470441 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  4 00:28:24 np0005470441 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  4 00:28:24 np0005470441 kernel: ftrace: allocated 193 pages with 3 groups
Oct  4 00:28:24 np0005470441 kernel: Dynamic Preempt: voluntary
Oct  4 00:28:24 np0005470441 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  4 00:28:24 np0005470441 kernel: rcu: 	RCU event tracing is enabled.
Oct  4 00:28:24 np0005470441 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  4 00:28:24 np0005470441 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  4 00:28:24 np0005470441 kernel: 	Rude variant of Tasks RCU enabled.
Oct  4 00:28:24 np0005470441 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  4 00:28:24 np0005470441 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  4 00:28:24 np0005470441 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  4 00:28:24 np0005470441 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  4 00:28:24 np0005470441 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  4 00:28:24 np0005470441 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  4 00:28:24 np0005470441 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  4 00:28:24 np0005470441 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  4 00:28:24 np0005470441 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  4 00:28:24 np0005470441 kernel: Console: colour VGA+ 80x25
Oct  4 00:28:24 np0005470441 kernel: printk: console [ttyS0] enabled
Oct  4 00:28:24 np0005470441 kernel: ACPI: Core revision 20230331
Oct  4 00:28:24 np0005470441 kernel: APIC: Switch to symmetric I/O mode setup
Oct  4 00:28:24 np0005470441 kernel: x2apic enabled
Oct  4 00:28:24 np0005470441 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  4 00:28:24 np0005470441 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  4 00:28:24 np0005470441 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  4 00:28:24 np0005470441 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  4 00:28:24 np0005470441 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  4 00:28:24 np0005470441 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  4 00:28:24 np0005470441 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  4 00:28:24 np0005470441 kernel: Spectre V2 : Mitigation: Retpolines
Oct  4 00:28:24 np0005470441 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  4 00:28:24 np0005470441 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  4 00:28:24 np0005470441 kernel: RETBleed: Mitigation: untrained return thunk
Oct  4 00:28:24 np0005470441 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  4 00:28:24 np0005470441 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  4 00:28:24 np0005470441 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  4 00:28:24 np0005470441 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  4 00:28:24 np0005470441 kernel: x86/bugs: return thunk changed
Oct  4 00:28:24 np0005470441 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  4 00:28:24 np0005470441 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  4 00:28:24 np0005470441 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  4 00:28:24 np0005470441 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  4 00:28:24 np0005470441 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  4 00:28:24 np0005470441 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  4 00:28:24 np0005470441 kernel: Freeing SMP alternatives memory: 40K
Oct  4 00:28:24 np0005470441 kernel: pid_max: default: 32768 minimum: 301
Oct  4 00:28:24 np0005470441 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  4 00:28:24 np0005470441 kernel: landlock: Up and running.
Oct  4 00:28:24 np0005470441 kernel: Yama: becoming mindful.
Oct  4 00:28:24 np0005470441 kernel: SELinux:  Initializing.
Oct  4 00:28:24 np0005470441 kernel: LSM support for eBPF active
Oct  4 00:28:24 np0005470441 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  4 00:28:24 np0005470441 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  4 00:28:24 np0005470441 kernel: ... version:                0
Oct  4 00:28:24 np0005470441 kernel: ... bit width:              48
Oct  4 00:28:24 np0005470441 kernel: ... generic registers:      6
Oct  4 00:28:24 np0005470441 kernel: ... value mask:             0000ffffffffffff
Oct  4 00:28:24 np0005470441 kernel: ... max period:             00007fffffffffff
Oct  4 00:28:24 np0005470441 kernel: ... fixed-purpose events:   0
Oct  4 00:28:24 np0005470441 kernel: ... event mask:             000000000000003f
Oct  4 00:28:24 np0005470441 kernel: signal: max sigframe size: 1776
Oct  4 00:28:24 np0005470441 kernel: rcu: Hierarchical SRCU implementation.
Oct  4 00:28:24 np0005470441 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  4 00:28:24 np0005470441 kernel: smp: Bringing up secondary CPUs ...
Oct  4 00:28:24 np0005470441 kernel: smpboot: x86: Booting SMP configuration:
Oct  4 00:28:24 np0005470441 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  4 00:28:24 np0005470441 kernel: smp: Brought up 1 node, 8 CPUs
Oct  4 00:28:24 np0005470441 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  4 00:28:24 np0005470441 kernel: node 0 deferred pages initialised in 29ms
Oct  4 00:28:24 np0005470441 kernel: Memory: 7765644K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616512K reserved, 0K cma-reserved)
Oct  4 00:28:24 np0005470441 kernel: devtmpfs: initialized
Oct  4 00:28:24 np0005470441 kernel: x86/mm: Memory block size: 128MB
Oct  4 00:28:24 np0005470441 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  4 00:28:24 np0005470441 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: pinctrl core: initialized pinctrl subsystem
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  4 00:28:24 np0005470441 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  4 00:28:24 np0005470441 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  4 00:28:24 np0005470441 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  4 00:28:24 np0005470441 kernel: audit: initializing netlink subsys (disabled)
Oct  4 00:28:24 np0005470441 kernel: audit: type=2000 audit(1759552102.851:1): state=initialized audit_enabled=0 res=1
Oct  4 00:28:24 np0005470441 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  4 00:28:24 np0005470441 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  4 00:28:24 np0005470441 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  4 00:28:24 np0005470441 kernel: cpuidle: using governor menu
Oct  4 00:28:24 np0005470441 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  4 00:28:24 np0005470441 kernel: PCI: Using configuration type 1 for base access
Oct  4 00:28:24 np0005470441 kernel: PCI: Using configuration type 1 for extended access
Oct  4 00:28:24 np0005470441 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  4 00:28:24 np0005470441 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  4 00:28:24 np0005470441 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  4 00:28:24 np0005470441 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  4 00:28:24 np0005470441 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  4 00:28:24 np0005470441 kernel: Demotion targets for Node 0: null
Oct  4 00:28:24 np0005470441 kernel: cryptd: max_cpu_qlen set to 1000
Oct  4 00:28:24 np0005470441 kernel: ACPI: Added _OSI(Module Device)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Added _OSI(Processor Device)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  4 00:28:24 np0005470441 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  4 00:28:24 np0005470441 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  4 00:28:24 np0005470441 kernel: ACPI: Interpreter enabled
Oct  4 00:28:24 np0005470441 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  4 00:28:24 np0005470441 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  4 00:28:24 np0005470441 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  4 00:28:24 np0005470441 kernel: PCI: Using E820 reservations for host bridge windows
Oct  4 00:28:24 np0005470441 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  4 00:28:24 np0005470441 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [3] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [4] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [5] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [6] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [7] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [8] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [9] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [10] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [11] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [12] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [13] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [14] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [15] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [16] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [17] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [18] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [19] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [20] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [21] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [22] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [23] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [24] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [25] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [26] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [27] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [28] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [29] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [30] registered
Oct  4 00:28:24 np0005470441 kernel: acpiphp: Slot [31] registered
Oct  4 00:28:24 np0005470441 kernel: PCI host bridge to bus 0000:00
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  4 00:28:24 np0005470441 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  4 00:28:24 np0005470441 kernel: iommu: Default domain type: Translated
Oct  4 00:28:24 np0005470441 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  4 00:28:24 np0005470441 kernel: SCSI subsystem initialized
Oct  4 00:28:24 np0005470441 kernel: ACPI: bus type USB registered
Oct  4 00:28:24 np0005470441 kernel: usbcore: registered new interface driver usbfs
Oct  4 00:28:24 np0005470441 kernel: usbcore: registered new interface driver hub
Oct  4 00:28:24 np0005470441 kernel: usbcore: registered new device driver usb
Oct  4 00:28:24 np0005470441 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  4 00:28:24 np0005470441 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  4 00:28:24 np0005470441 kernel: PTP clock support registered
Oct  4 00:28:24 np0005470441 kernel: EDAC MC: Ver: 3.0.0
Oct  4 00:28:24 np0005470441 kernel: NetLabel: Initializing
Oct  4 00:28:24 np0005470441 kernel: NetLabel:  domain hash size = 128
Oct  4 00:28:24 np0005470441 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  4 00:28:24 np0005470441 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  4 00:28:24 np0005470441 kernel: PCI: Using ACPI for IRQ routing
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  4 00:28:24 np0005470441 kernel: vgaarb: loaded
Oct  4 00:28:24 np0005470441 kernel: clocksource: Switched to clocksource kvm-clock
Oct  4 00:28:24 np0005470441 kernel: VFS: Disk quotas dquot_6.6.0
Oct  4 00:28:24 np0005470441 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  4 00:28:24 np0005470441 kernel: pnp: PnP ACPI init
Oct  4 00:28:24 np0005470441 kernel: pnp: PnP ACPI: found 5 devices
Oct  4 00:28:24 np0005470441 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_INET protocol family
Oct  4 00:28:24 np0005470441 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  4 00:28:24 np0005470441 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_XDP protocol family
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  4 00:28:24 np0005470441 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  4 00:28:24 np0005470441 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  4 00:28:24 np0005470441 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 85865 usecs
Oct  4 00:28:24 np0005470441 kernel: PCI: CLS 0 bytes, default 64
Oct  4 00:28:24 np0005470441 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  4 00:28:24 np0005470441 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  4 00:28:24 np0005470441 kernel: ACPI: bus type thunderbolt registered
Oct  4 00:28:24 np0005470441 kernel: Trying to unpack rootfs image as initramfs...
Oct  4 00:28:24 np0005470441 kernel: Initialise system trusted keyrings
Oct  4 00:28:24 np0005470441 kernel: Key type blacklist registered
Oct  4 00:28:24 np0005470441 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  4 00:28:24 np0005470441 kernel: zbud: loaded
Oct  4 00:28:24 np0005470441 kernel: integrity: Platform Keyring initialized
Oct  4 00:28:24 np0005470441 kernel: integrity: Machine keyring initialized
Oct  4 00:28:24 np0005470441 kernel: Freeing initrd memory: 86104K
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_ALG protocol family
Oct  4 00:28:24 np0005470441 kernel: xor: automatically using best checksumming function   avx       
Oct  4 00:28:24 np0005470441 kernel: Key type asymmetric registered
Oct  4 00:28:24 np0005470441 kernel: Asymmetric key parser 'x509' registered
Oct  4 00:28:24 np0005470441 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  4 00:28:24 np0005470441 kernel: io scheduler mq-deadline registered
Oct  4 00:28:24 np0005470441 kernel: io scheduler kyber registered
Oct  4 00:28:24 np0005470441 kernel: io scheduler bfq registered
Oct  4 00:28:24 np0005470441 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  4 00:28:24 np0005470441 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  4 00:28:24 np0005470441 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  4 00:28:24 np0005470441 kernel: ACPI: button: Power Button [PWRF]
Oct  4 00:28:24 np0005470441 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  4 00:28:24 np0005470441 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  4 00:28:24 np0005470441 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  4 00:28:24 np0005470441 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  4 00:28:24 np0005470441 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  4 00:28:24 np0005470441 kernel: Non-volatile memory driver v1.3
Oct  4 00:28:24 np0005470441 kernel: rdac: device handler registered
Oct  4 00:28:24 np0005470441 kernel: hp_sw: device handler registered
Oct  4 00:28:24 np0005470441 kernel: emc: device handler registered
Oct  4 00:28:24 np0005470441 kernel: alua: device handler registered
Oct  4 00:28:24 np0005470441 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  4 00:28:24 np0005470441 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  4 00:28:24 np0005470441 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  4 00:28:24 np0005470441 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  4 00:28:24 np0005470441 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  4 00:28:24 np0005470441 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  4 00:28:24 np0005470441 kernel: usb usb1: Product: UHCI Host Controller
Oct  4 00:28:24 np0005470441 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  4 00:28:24 np0005470441 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  4 00:28:24 np0005470441 kernel: hub 1-0:1.0: USB hub found
Oct  4 00:28:24 np0005470441 kernel: hub 1-0:1.0: 2 ports detected
Oct  4 00:28:24 np0005470441 kernel: usbcore: registered new interface driver usbserial_generic
Oct  4 00:28:24 np0005470441 kernel: usbserial: USB Serial support registered for generic
Oct  4 00:28:24 np0005470441 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  4 00:28:24 np0005470441 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  4 00:28:24 np0005470441 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  4 00:28:24 np0005470441 kernel: mousedev: PS/2 mouse device common for all mice
Oct  4 00:28:24 np0005470441 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  4 00:28:24 np0005470441 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  4 00:28:24 np0005470441 kernel: rtc_cmos 00:04: registered as rtc0
Oct  4 00:28:24 np0005470441 kernel: rtc_cmos 00:04: setting system clock to 2025-10-04T04:28:23 UTC (1759552103)
Oct  4 00:28:24 np0005470441 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  4 00:28:24 np0005470441 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  4 00:28:24 np0005470441 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  4 00:28:24 np0005470441 kernel: usbcore: registered new interface driver usbhid
Oct  4 00:28:24 np0005470441 kernel: usbhid: USB HID core driver
Oct  4 00:28:24 np0005470441 kernel: drop_monitor: Initializing network drop monitor service
Oct  4 00:28:24 np0005470441 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  4 00:28:24 np0005470441 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  4 00:28:24 np0005470441 kernel: Initializing XFRM netlink socket
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_INET6 protocol family
Oct  4 00:28:24 np0005470441 kernel: Segment Routing with IPv6
Oct  4 00:28:24 np0005470441 kernel: NET: Registered PF_PACKET protocol family
Oct  4 00:28:24 np0005470441 kernel: mpls_gso: MPLS GSO support
Oct  4 00:28:24 np0005470441 kernel: IPI shorthand broadcast: enabled
Oct  4 00:28:24 np0005470441 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  4 00:28:24 np0005470441 kernel: AES CTR mode by8 optimization enabled
Oct  4 00:28:24 np0005470441 kernel: sched_clock: Marking stable (1252006600, 147862420)->(1475459430, -75590410)
Oct  4 00:28:24 np0005470441 kernel: registered taskstats version 1
Oct  4 00:28:24 np0005470441 kernel: Loading compiled-in X.509 certificates
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  4 00:28:24 np0005470441 kernel: Demotion targets for Node 0: null
Oct  4 00:28:24 np0005470441 kernel: page_owner is disabled
Oct  4 00:28:24 np0005470441 kernel: Key type .fscrypt registered
Oct  4 00:28:24 np0005470441 kernel: Key type fscrypt-provisioning registered
Oct  4 00:28:24 np0005470441 kernel: Key type big_key registered
Oct  4 00:28:24 np0005470441 kernel: Key type encrypted registered
Oct  4 00:28:24 np0005470441 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  4 00:28:24 np0005470441 kernel: Loading compiled-in module X.509 certificates
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  4 00:28:24 np0005470441 kernel: ima: Allocated hash algorithm: sha256
Oct  4 00:28:24 np0005470441 kernel: ima: No architecture policies found
Oct  4 00:28:24 np0005470441 kernel: evm: Initialising EVM extended attributes:
Oct  4 00:28:24 np0005470441 kernel: evm: security.selinux
Oct  4 00:28:24 np0005470441 kernel: evm: security.SMACK64 (disabled)
Oct  4 00:28:24 np0005470441 kernel: evm: security.SMACK64EXEC (disabled)
Oct  4 00:28:24 np0005470441 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  4 00:28:24 np0005470441 kernel: evm: security.SMACK64MMAP (disabled)
Oct  4 00:28:24 np0005470441 kernel: evm: security.apparmor (disabled)
Oct  4 00:28:24 np0005470441 kernel: evm: security.ima
Oct  4 00:28:24 np0005470441 kernel: evm: security.capability
Oct  4 00:28:24 np0005470441 kernel: evm: HMAC attrs: 0x1
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  4 00:28:24 np0005470441 kernel: Running certificate verification RSA selftest
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  4 00:28:24 np0005470441 kernel: Running certificate verification ECDSA selftest
Oct  4 00:28:24 np0005470441 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  4 00:28:24 np0005470441 kernel: clk: Disabling unused clocks
Oct  4 00:28:24 np0005470441 kernel: Freeing unused decrypted memory: 2028K
Oct  4 00:28:24 np0005470441 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  4 00:28:24 np0005470441 kernel: Write protecting the kernel read-only data: 30720k
Oct  4 00:28:24 np0005470441 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: Manufacturer: QEMU
Oct  4 00:28:24 np0005470441 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  4 00:28:24 np0005470441 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  4 00:28:24 np0005470441 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  4 00:28:24 np0005470441 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  4 00:28:24 np0005470441 kernel: Run /init as init process
Oct  4 00:28:24 np0005470441 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  4 00:28:24 np0005470441 systemd: Detected virtualization kvm.
Oct  4 00:28:24 np0005470441 systemd: Detected architecture x86-64.
Oct  4 00:28:24 np0005470441 systemd: Running in initrd.
Oct  4 00:28:24 np0005470441 systemd: No hostname configured, using default hostname.
Oct  4 00:28:24 np0005470441 systemd: Hostname set to <localhost>.
Oct  4 00:28:24 np0005470441 systemd: Initializing machine ID from VM UUID.
Oct  4 00:28:24 np0005470441 systemd: Queued start job for default target Initrd Default Target.
Oct  4 00:28:24 np0005470441 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  4 00:28:24 np0005470441 systemd: Reached target Local Encrypted Volumes.
Oct  4 00:28:24 np0005470441 systemd: Reached target Initrd /usr File System.
Oct  4 00:28:24 np0005470441 systemd: Reached target Local File Systems.
Oct  4 00:28:24 np0005470441 systemd: Reached target Path Units.
Oct  4 00:28:24 np0005470441 systemd: Reached target Slice Units.
Oct  4 00:28:24 np0005470441 systemd: Reached target Swaps.
Oct  4 00:28:24 np0005470441 systemd: Reached target Timer Units.
Oct  4 00:28:24 np0005470441 systemd: Listening on D-Bus System Message Bus Socket.
Oct  4 00:28:24 np0005470441 systemd: Listening on Journal Socket (/dev/log).
Oct  4 00:28:24 np0005470441 systemd: Listening on Journal Socket.
Oct  4 00:28:24 np0005470441 systemd: Listening on udev Control Socket.
Oct  4 00:28:24 np0005470441 systemd: Listening on udev Kernel Socket.
Oct  4 00:28:24 np0005470441 systemd: Reached target Socket Units.
Oct  4 00:28:24 np0005470441 systemd: Starting Create List of Static Device Nodes...
Oct  4 00:28:24 np0005470441 systemd: Starting Journal Service...
Oct  4 00:28:24 np0005470441 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  4 00:28:24 np0005470441 systemd: Starting Apply Kernel Variables...
Oct  4 00:28:24 np0005470441 systemd: Starting Create System Users...
Oct  4 00:28:24 np0005470441 systemd: Starting Setup Virtual Console...
Oct  4 00:28:24 np0005470441 systemd: Finished Create List of Static Device Nodes.
Oct  4 00:28:24 np0005470441 systemd: Finished Apply Kernel Variables.
Oct  4 00:28:24 np0005470441 systemd: Finished Create System Users.
Oct  4 00:28:24 np0005470441 systemd-journald[305]: Journal started
Oct  4 00:28:24 np0005470441 systemd-journald[305]: Runtime Journal (/run/log/journal/2c012175598446419ddd886dcd6e4c6f) is 8.0M, max 153.5M, 145.5M free.
Oct  4 00:28:24 np0005470441 systemd-sysusers[310]: Creating group 'users' with GID 100.
Oct  4 00:28:24 np0005470441 systemd-sysusers[310]: Creating group 'dbus' with GID 81.
Oct  4 00:28:24 np0005470441 systemd-sysusers[310]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  4 00:28:24 np0005470441 systemd: Started Journal Service.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  4 00:28:24 np0005470441 systemd[1]: Starting Create Volatile Files and Directories...
Oct  4 00:28:24 np0005470441 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  4 00:28:24 np0005470441 systemd[1]: Finished Create Volatile Files and Directories.
Oct  4 00:28:24 np0005470441 systemd[1]: Finished Setup Virtual Console.
Oct  4 00:28:24 np0005470441 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting dracut cmdline hook...
Oct  4 00:28:24 np0005470441 dracut-cmdline[325]: dracut-9 dracut-057-102.git20250818.el9
Oct  4 00:28:24 np0005470441 dracut-cmdline[325]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  4 00:28:24 np0005470441 systemd[1]: Finished dracut cmdline hook.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting dracut pre-udev hook...
Oct  4 00:28:24 np0005470441 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  4 00:28:24 np0005470441 kernel: device-mapper: uevent: version 1.0.3
Oct  4 00:28:24 np0005470441 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  4 00:28:24 np0005470441 kernel: RPC: Registered named UNIX socket transport module.
Oct  4 00:28:24 np0005470441 kernel: RPC: Registered udp transport module.
Oct  4 00:28:24 np0005470441 kernel: RPC: Registered tcp transport module.
Oct  4 00:28:24 np0005470441 kernel: RPC: Registered tcp-with-tls transport module.
Oct  4 00:28:24 np0005470441 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  4 00:28:24 np0005470441 rpc.statd[442]: Version 2.5.4 starting
Oct  4 00:28:24 np0005470441 rpc.statd[442]: Initializing NSM state
Oct  4 00:28:24 np0005470441 rpc.idmapd[447]: Setting log level to 0
Oct  4 00:28:24 np0005470441 systemd[1]: Finished dracut pre-udev hook.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  4 00:28:24 np0005470441 systemd-udevd[460]: Using default interface naming scheme 'rhel-9.0'.
Oct  4 00:28:24 np0005470441 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting dracut pre-trigger hook...
Oct  4 00:28:24 np0005470441 systemd[1]: Finished dracut pre-trigger hook.
Oct  4 00:28:24 np0005470441 systemd[1]: Starting Coldplug All udev Devices...
Oct  4 00:28:25 np0005470441 systemd[1]: Created slice Slice /system/modprobe.
Oct  4 00:28:25 np0005470441 systemd[1]: Starting Load Kernel Module configfs...
Oct  4 00:28:25 np0005470441 systemd[1]: Finished Coldplug All udev Devices.
Oct  4 00:28:25 np0005470441 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  4 00:28:25 np0005470441 systemd[1]: Finished Load Kernel Module configfs.
Oct  4 00:28:25 np0005470441 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Network.
Oct  4 00:28:25 np0005470441 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  4 00:28:25 np0005470441 systemd[1]: Starting dracut initqueue hook...
Oct  4 00:28:25 np0005470441 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  4 00:28:25 np0005470441 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  4 00:28:25 np0005470441 kernel: vda: vda1
Oct  4 00:28:25 np0005470441 kernel: scsi host0: ata_piix
Oct  4 00:28:25 np0005470441 kernel: scsi host1: ata_piix
Oct  4 00:28:25 np0005470441 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  4 00:28:25 np0005470441 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  4 00:28:25 np0005470441 systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 00:28:25 np0005470441 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Initrd Root Device.
Oct  4 00:28:25 np0005470441 systemd[1]: Mounting Kernel Configuration File System...
Oct  4 00:28:25 np0005470441 kernel: ata1: found unknown device (class 0)
Oct  4 00:28:25 np0005470441 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  4 00:28:25 np0005470441 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  4 00:28:25 np0005470441 systemd[1]: Mounted Kernel Configuration File System.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target System Initialization.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Basic System.
Oct  4 00:28:25 np0005470441 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  4 00:28:25 np0005470441 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  4 00:28:25 np0005470441 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  4 00:28:25 np0005470441 systemd[1]: Finished dracut initqueue hook.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  4 00:28:25 np0005470441 systemd[1]: Reached target Remote File Systems.
Oct  4 00:28:25 np0005470441 systemd[1]: Starting dracut pre-mount hook...
Oct  4 00:28:25 np0005470441 systemd[1]: Finished dracut pre-mount hook.
Oct  4 00:28:25 np0005470441 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  4 00:28:25 np0005470441 systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Oct  4 00:28:25 np0005470441 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  4 00:28:25 np0005470441 systemd[1]: Mounting /sysroot...
Oct  4 00:28:26 np0005470441 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  4 00:28:26 np0005470441 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  4 00:28:26 np0005470441 kernel: XFS (vda1): Ending clean mount
Oct  4 00:28:26 np0005470441 systemd[1]: Mounted /sysroot.
Oct  4 00:28:26 np0005470441 systemd[1]: Reached target Initrd Root File System.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  4 00:28:26 np0005470441 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  4 00:28:26 np0005470441 systemd[1]: Reached target Initrd File Systems.
Oct  4 00:28:26 np0005470441 systemd[1]: Reached target Initrd Default Target.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting dracut mount hook...
Oct  4 00:28:26 np0005470441 systemd[1]: Finished dracut mount hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  4 00:28:26 np0005470441 rpc.idmapd[447]: exiting on signal 15
Oct  4 00:28:26 np0005470441 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Network.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Timer Units.
Oct  4 00:28:26 np0005470441 systemd[1]: dbus.socket: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Initrd Default Target.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Basic System.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Initrd Root Device.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Initrd /usr File System.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Path Units.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Remote File Systems.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Slice Units.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Socket Units.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target System Initialization.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Local File Systems.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Swaps.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut mount hook.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut pre-mount hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut initqueue hook.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Apply Kernel Variables.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Coldplug All udev Devices.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut pre-trigger hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Setup Virtual Console.
Oct  4 00:28:26 np0005470441 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Closed udev Control Socket.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Closed udev Kernel Socket.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut pre-udev hook.
Oct  4 00:28:26 np0005470441 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped dracut cmdline hook.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting Cleanup udev Database...
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  4 00:28:26 np0005470441 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  4 00:28:26 np0005470441 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Stopped Create System Users.
Oct  4 00:28:26 np0005470441 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  4 00:28:26 np0005470441 systemd[1]: Finished Cleanup udev Database.
Oct  4 00:28:26 np0005470441 systemd[1]: Reached target Switch Root.
Oct  4 00:28:26 np0005470441 systemd[1]: Starting Switch Root...
Oct  4 00:28:26 np0005470441 systemd[1]: Switching root.
Oct  4 00:28:26 np0005470441 systemd-journald[305]: Journal stopped
Oct  4 00:28:27 np0005470441 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  4 00:28:27 np0005470441 kernel: audit: type=1404 audit(1759552106.647:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 00:28:27 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 00:28:27 np0005470441 kernel: audit: type=1403 audit(1759552106.817:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  4 00:28:27 np0005470441 systemd: Successfully loaded SELinux policy in 174.358ms.
Oct  4 00:28:27 np0005470441 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 31.090ms.
Oct  4 00:28:27 np0005470441 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  4 00:28:27 np0005470441 systemd: Detected virtualization kvm.
Oct  4 00:28:27 np0005470441 systemd: Detected architecture x86-64.
Oct  4 00:28:27 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 00:28:27 np0005470441 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd: Stopped Switch Root.
Oct  4 00:28:27 np0005470441 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  4 00:28:27 np0005470441 systemd: Created slice Slice /system/getty.
Oct  4 00:28:27 np0005470441 systemd: Created slice Slice /system/serial-getty.
Oct  4 00:28:27 np0005470441 systemd: Created slice Slice /system/sshd-keygen.
Oct  4 00:28:27 np0005470441 systemd: Created slice User and Session Slice.
Oct  4 00:28:27 np0005470441 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  4 00:28:27 np0005470441 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  4 00:28:27 np0005470441 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  4 00:28:27 np0005470441 systemd: Reached target Local Encrypted Volumes.
Oct  4 00:28:27 np0005470441 systemd: Stopped target Switch Root.
Oct  4 00:28:27 np0005470441 systemd: Stopped target Initrd File Systems.
Oct  4 00:28:27 np0005470441 systemd: Stopped target Initrd Root File System.
Oct  4 00:28:27 np0005470441 systemd: Reached target Local Integrity Protected Volumes.
Oct  4 00:28:27 np0005470441 systemd: Reached target Path Units.
Oct  4 00:28:27 np0005470441 systemd: Reached target rpc_pipefs.target.
Oct  4 00:28:27 np0005470441 systemd: Reached target Slice Units.
Oct  4 00:28:27 np0005470441 systemd: Reached target Swaps.
Oct  4 00:28:27 np0005470441 systemd: Reached target Local Verity Protected Volumes.
Oct  4 00:28:27 np0005470441 systemd: Listening on RPCbind Server Activation Socket.
Oct  4 00:28:27 np0005470441 systemd: Reached target RPC Port Mapper.
Oct  4 00:28:27 np0005470441 systemd: Listening on Process Core Dump Socket.
Oct  4 00:28:27 np0005470441 systemd: Listening on initctl Compatibility Named Pipe.
Oct  4 00:28:27 np0005470441 systemd: Listening on udev Control Socket.
Oct  4 00:28:27 np0005470441 systemd: Listening on udev Kernel Socket.
Oct  4 00:28:27 np0005470441 systemd: Mounting Huge Pages File System...
Oct  4 00:28:27 np0005470441 systemd: Mounting POSIX Message Queue File System...
Oct  4 00:28:27 np0005470441 systemd: Mounting Kernel Debug File System...
Oct  4 00:28:27 np0005470441 systemd: Mounting Kernel Trace File System...
Oct  4 00:28:27 np0005470441 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  4 00:28:27 np0005470441 systemd: Starting Create List of Static Device Nodes...
Oct  4 00:28:27 np0005470441 systemd: Starting Load Kernel Module configfs...
Oct  4 00:28:27 np0005470441 systemd: Starting Load Kernel Module drm...
Oct  4 00:28:27 np0005470441 systemd: Starting Load Kernel Module efi_pstore...
Oct  4 00:28:27 np0005470441 systemd: Starting Load Kernel Module fuse...
Oct  4 00:28:27 np0005470441 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  4 00:28:27 np0005470441 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd: Stopped File System Check on Root Device.
Oct  4 00:28:27 np0005470441 systemd: Stopped Journal Service.
Oct  4 00:28:27 np0005470441 systemd: Starting Journal Service...
Oct  4 00:28:27 np0005470441 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  4 00:28:27 np0005470441 systemd: Starting Generate network units from Kernel command line...
Oct  4 00:28:27 np0005470441 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  4 00:28:27 np0005470441 systemd: Starting Remount Root and Kernel File Systems...
Oct  4 00:28:27 np0005470441 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  4 00:28:27 np0005470441 systemd: Starting Apply Kernel Variables...
Oct  4 00:28:27 np0005470441 kernel: fuse: init (API version 7.37)
Oct  4 00:28:27 np0005470441 systemd: Starting Coldplug All udev Devices...
Oct  4 00:28:27 np0005470441 systemd: Mounted Huge Pages File System.
Oct  4 00:28:27 np0005470441 systemd: Mounted POSIX Message Queue File System.
Oct  4 00:28:27 np0005470441 systemd: Mounted Kernel Debug File System.
Oct  4 00:28:27 np0005470441 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  4 00:28:27 np0005470441 systemd: Mounted Kernel Trace File System.
Oct  4 00:28:27 np0005470441 systemd-journald[675]: Journal started
Oct  4 00:28:27 np0005470441 systemd-journald[675]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  4 00:28:27 np0005470441 systemd: Finished Create List of Static Device Nodes.
Oct  4 00:28:27 np0005470441 systemd[1]: Queued start job for default target Multi-User System.
Oct  4 00:28:27 np0005470441 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd: Started Journal Service.
Oct  4 00:28:27 np0005470441 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Load Kernel Module configfs.
Oct  4 00:28:27 np0005470441 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  4 00:28:27 np0005470441 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Load Kernel Module fuse.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Generate network units from Kernel command line.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Apply Kernel Variables.
Oct  4 00:28:27 np0005470441 kernel: ACPI: bus type drm_connector registered
Oct  4 00:28:27 np0005470441 systemd[1]: Mounting FUSE Control File System...
Oct  4 00:28:27 np0005470441 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Rebuild Hardware Database...
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  4 00:28:27 np0005470441 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Load/Save OS Random Seed...
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Create System Users...
Oct  4 00:28:27 np0005470441 systemd-journald[675]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  4 00:28:27 np0005470441 systemd-journald[675]: Received client request to flush runtime journal.
Oct  4 00:28:27 np0005470441 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Load Kernel Module drm.
Oct  4 00:28:27 np0005470441 systemd[1]: Mounted FUSE Control File System.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Load/Save OS Random Seed.
Oct  4 00:28:27 np0005470441 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Coldplug All udev Devices.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Create System Users.
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  4 00:28:27 np0005470441 systemd[1]: Reached target Preparation for Local File Systems.
Oct  4 00:28:27 np0005470441 systemd[1]: Reached target Local File Systems.
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  4 00:28:27 np0005470441 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  4 00:28:27 np0005470441 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  4 00:28:27 np0005470441 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Automatic Boot Loader Update...
Oct  4 00:28:27 np0005470441 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  4 00:28:27 np0005470441 systemd[1]: Starting Create Volatile Files and Directories...
Oct  4 00:28:27 np0005470441 bootctl[693]: Couldn't find EFI system partition, skipping.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Automatic Boot Loader Update.
Oct  4 00:28:27 np0005470441 systemd[1]: Finished Create Volatile Files and Directories.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Security Auditing Service...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting RPC Bind...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Rebuild Journal Catalog...
Oct  4 00:28:28 np0005470441 auditd[699]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  4 00:28:28 np0005470441 auditd[699]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Rebuild Journal Catalog.
Oct  4 00:28:28 np0005470441 systemd[1]: Started RPC Bind.
Oct  4 00:28:28 np0005470441 augenrules[704]: /sbin/augenrules: No change
Oct  4 00:28:28 np0005470441 augenrules[719]: No rules
Oct  4 00:28:28 np0005470441 augenrules[719]: enabled 1
Oct  4 00:28:28 np0005470441 augenrules[719]: failure 1
Oct  4 00:28:28 np0005470441 augenrules[719]: pid 699
Oct  4 00:28:28 np0005470441 augenrules[719]: rate_limit 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_limit 8192
Oct  4 00:28:28 np0005470441 augenrules[719]: lost 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog 2
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time 60000
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time_actual 0
Oct  4 00:28:28 np0005470441 augenrules[719]: enabled 1
Oct  4 00:28:28 np0005470441 augenrules[719]: failure 1
Oct  4 00:28:28 np0005470441 augenrules[719]: pid 699
Oct  4 00:28:28 np0005470441 augenrules[719]: rate_limit 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_limit 8192
Oct  4 00:28:28 np0005470441 augenrules[719]: lost 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog 1
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time 60000
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time_actual 0
Oct  4 00:28:28 np0005470441 augenrules[719]: enabled 1
Oct  4 00:28:28 np0005470441 augenrules[719]: failure 1
Oct  4 00:28:28 np0005470441 augenrules[719]: pid 699
Oct  4 00:28:28 np0005470441 augenrules[719]: rate_limit 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_limit 8192
Oct  4 00:28:28 np0005470441 augenrules[719]: lost 0
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog 3
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time 60000
Oct  4 00:28:28 np0005470441 augenrules[719]: backlog_wait_time_actual 0
Oct  4 00:28:28 np0005470441 systemd[1]: Started Security Auditing Service.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Rebuild Hardware Database.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Update is Completed...
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Update is Completed.
Oct  4 00:28:28 np0005470441 systemd-udevd[728]: Using default interface naming scheme 'rhel-9.0'.
Oct  4 00:28:28 np0005470441 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target System Initialization.
Oct  4 00:28:28 np0005470441 systemd[1]: Started dnf makecache --timer.
Oct  4 00:28:28 np0005470441 systemd[1]: Started Daily rotation of log files.
Oct  4 00:28:28 np0005470441 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target Timer Units.
Oct  4 00:28:28 np0005470441 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  4 00:28:28 np0005470441 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target Socket Units.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting D-Bus System Message Bus...
Oct  4 00:28:28 np0005470441 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  4 00:28:28 np0005470441 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Load Kernel Module configfs...
Oct  4 00:28:28 np0005470441 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Load Kernel Module configfs.
Oct  4 00:28:28 np0005470441 systemd-udevd[734]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 00:28:28 np0005470441 systemd[1]: Started D-Bus System Message Bus.
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target Basic System.
Oct  4 00:28:28 np0005470441 dbus-broker-lau[759]: Ready
Oct  4 00:28:28 np0005470441 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  4 00:28:28 np0005470441 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  4 00:28:28 np0005470441 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  4 00:28:28 np0005470441 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  4 00:28:28 np0005470441 systemd[1]: Starting NTP client/server...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  4 00:28:28 np0005470441 systemd[1]: Starting IPv4 firewall with iptables...
Oct  4 00:28:28 np0005470441 systemd[1]: Started irqbalance daemon.
Oct  4 00:28:28 np0005470441 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  4 00:28:28 np0005470441 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 00:28:28 np0005470441 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 00:28:28 np0005470441 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target sshd-keygen.target.
Oct  4 00:28:28 np0005470441 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  4 00:28:28 np0005470441 systemd[1]: Reached target User and Group Name Lookups.
Oct  4 00:28:28 np0005470441 systemd[1]: Starting User Login Management...
Oct  4 00:28:28 np0005470441 chronyd[798]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  4 00:28:28 np0005470441 chronyd[798]: Loaded 0 symmetric keys
Oct  4 00:28:28 np0005470441 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  4 00:28:28 np0005470441 chronyd[798]: Using right/UTC timezone to obtain leap second data
Oct  4 00:28:28 np0005470441 chronyd[798]: Loaded seccomp filter (level 2)
Oct  4 00:28:28 np0005470441 systemd[1]: Started NTP client/server.
Oct  4 00:28:28 np0005470441 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  4 00:28:28 np0005470441 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  4 00:28:28 np0005470441 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  4 00:28:28 np0005470441 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  4 00:28:28 np0005470441 systemd-logind[796]: New seat seat0.
Oct  4 00:28:28 np0005470441 systemd[1]: Started User Login Management.
Oct  4 00:28:28 np0005470441 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  4 00:28:28 np0005470441 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  4 00:28:28 np0005470441 kernel: Console: switching to colour dummy device 80x25
Oct  4 00:28:28 np0005470441 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  4 00:28:28 np0005470441 kernel: [drm] features: -context_init
Oct  4 00:28:28 np0005470441 kernel: [drm] number of scanouts: 1
Oct  4 00:28:28 np0005470441 kernel: [drm] number of cap sets: 0
Oct  4 00:28:28 np0005470441 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  4 00:28:28 np0005470441 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  4 00:28:28 np0005470441 kernel: Console: switching to colour frame buffer device 128x48
Oct  4 00:28:28 np0005470441 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  4 00:28:28 np0005470441 kernel: kvm_amd: TSC scaling supported
Oct  4 00:28:28 np0005470441 kernel: kvm_amd: Nested Virtualization enabled
Oct  4 00:28:28 np0005470441 kernel: kvm_amd: Nested Paging enabled
Oct  4 00:28:28 np0005470441 kernel: kvm_amd: LBR virtualization supported
Oct  4 00:28:28 np0005470441 iptables.init[780]: iptables: Applying firewall rules: [  OK  ]
Oct  4 00:28:28 np0005470441 systemd[1]: Finished IPv4 firewall with iptables.
Oct  4 00:28:30 np0005470441 cloud-init[838]: Cloud-init v. 24.4-7.el9 running 'init-local' at Sat, 04 Oct 2025 04:28:30 +0000. Up 7.73 seconds.
Oct  4 00:28:30 np0005470441 systemd[1]: run-cloud\x2dinit-tmp-tmpcjbdytyb.mount: Deactivated successfully.
Oct  4 00:28:30 np0005470441 systemd[1]: Starting Hostname Service...
Oct  4 00:28:30 np0005470441 systemd[1]: Started Hostname Service.
Oct  4 00:28:30 np0005470441 systemd-hostnamed[852]: Hostname set to <np0005470441.novalocal> (static)
Oct  4 00:28:30 np0005470441 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  4 00:28:30 np0005470441 systemd[1]: Reached target Preparation for Network.
Oct  4 00:28:30 np0005470441 systemd[1]: Starting Network Manager...
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7203] NetworkManager (version 1.54.1-1.el9) is starting... (boot:6827d816-bf4f-4d80-9923-db74c98231af)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7213] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7442] manager[0x562897908080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7513] hostname: hostname: using hostnamed
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7514] hostname: static hostname changed from (none) to "np0005470441.novalocal"
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7520] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7893] manager[0x562897908080]: rfkill: Wi-Fi hardware radio set enabled
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.7894] manager[0x562897908080]: rfkill: WWAN hardware radio set enabled
Oct  4 00:28:30 np0005470441 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8049] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8050] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8051] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8051] manager: Networking is enabled by state file
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8058] settings: Loaded settings plugin: keyfile (internal)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8100] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8147] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8182] dhcp: init: Using DHCP client 'internal'
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8188] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8213] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8269] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8280] device (lo): Activation: starting connection 'lo' (b59e5f5b-6646-4ca3-9ea8-9d0febc130b5)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8297] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8302] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 00:28:30 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 00:28:30 np0005470441 systemd[1]: Started Network Manager.
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8414] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8449] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 systemd[1]: Reached target Network.
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8496] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8500] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8504] device (eth0): carrier: link connected
Oct  4 00:28:30 np0005470441 systemd[1]: Starting Network Manager Wait Online...
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8511] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8570] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8577] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8581] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8581] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 00:28:30 np0005470441 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8593] manager: NetworkManager state is now CONNECTING
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8597] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8604] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8607] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:28:30 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8733] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8736] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  4 00:28:30 np0005470441 NetworkManager[856]: <info>  [1759552110.8743] device (lo): Activation: successful, device activated.
Oct  4 00:28:30 np0005470441 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  4 00:28:30 np0005470441 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  4 00:28:30 np0005470441 systemd[1]: Reached target NFS client services.
Oct  4 00:28:30 np0005470441 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  4 00:28:30 np0005470441 systemd[1]: Reached target Remote File Systems.
Oct  4 00:28:30 np0005470441 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4769] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4790] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4827] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4859] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4861] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4864] manager: NetworkManager state is now CONNECTED_SITE
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4869] device (eth0): Activation: successful, device activated.
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4875] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  4 00:28:33 np0005470441 NetworkManager[856]: <info>  [1759552113.4878] manager: startup complete
Oct  4 00:28:33 np0005470441 systemd[1]: Finished Network Manager Wait Online.
Oct  4 00:28:33 np0005470441 systemd[1]: Starting Cloud-init: Network Stage...
Oct  4 00:28:33 np0005470441 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Sat, 04 Oct 2025 04:28:33 +0000. Up 11.50 seconds.
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.144         | 255.255.255.0 | global | fa:16:3e:d2:37:9c |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fed2:379c/64 |       .       |  link  | fa:16:3e:d2:37:9c |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Oct  4 00:28:33 np0005470441 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  4 00:28:35 np0005470441 cloud-init[922]: Generating public/private rsa key pair.
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key fingerprint is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: SHA256:QpnBJ3BFfP5ambZy4T+C1fGA1Pna1Ke9xHUftEUpS48 root@np0005470441.novalocal
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key's randomart image is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: +---[RSA 3072]----+
Oct  4 00:28:35 np0005470441 cloud-init[922]: |    .oo+o    . oo|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |     .o+o . .ooo.|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |      +o o ...*.+|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |     .    . .Eo+B|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |      . S  . +.X*|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |       .    O +o=|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |           B o. .|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |          + = .. |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |           o o.. |
Oct  4 00:28:35 np0005470441 cloud-init[922]: +----[SHA256]-----+
Oct  4 00:28:35 np0005470441 cloud-init[922]: Generating public/private ecdsa key pair.
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key fingerprint is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: SHA256:f+cJiJ/1EEPW2SHopbOkCrmYKSzj4t7RfXSxxqQxVH4 root@np0005470441.novalocal
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key's randomart image is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: +---[ECDSA 256]---+
Oct  4 00:28:35 np0005470441 cloud-init[922]: |         ..... . |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |        . .. o.o.|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |         o.+=Eo .|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |          *B+    |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |      . Soo==    |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |    .o. .+oo o   |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |.  .+.o.o.o = .  |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |+o.+.. ... + * . |
Oct  4 00:28:35 np0005470441 cloud-init[922]: |*=..      o   +  |
Oct  4 00:28:35 np0005470441 cloud-init[922]: +----[SHA256]-----+
Oct  4 00:28:35 np0005470441 cloud-init[922]: Generating public/private ed25519 key pair.
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  4 00:28:35 np0005470441 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key fingerprint is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: SHA256:L0UuOyVU427nFSWPrnjUcdbi+xhmhDveWcX9q/kNw+0 root@np0005470441.novalocal
Oct  4 00:28:35 np0005470441 cloud-init[922]: The key's randomart image is:
Oct  4 00:28:35 np0005470441 cloud-init[922]: +--[ED25519 256]--+
Oct  4 00:28:35 np0005470441 cloud-init[922]: |          o   . .|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |         o .   =.|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |        . o   = =|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |       . +   = B.|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |        S * + * +|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |         O = * oo|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |        + o * O +|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |         o o =.Xo|
Oct  4 00:28:35 np0005470441 cloud-init[922]: |            .o*oE|
Oct  4 00:28:35 np0005470441 cloud-init[922]: +----[SHA256]-----+
Oct  4 00:28:35 np0005470441 systemd[1]: Finished Cloud-init: Network Stage.
Oct  4 00:28:35 np0005470441 systemd[1]: Reached target Cloud-config availability.
Oct  4 00:28:35 np0005470441 systemd[1]: Reached target Network is Online.
Oct  4 00:28:35 np0005470441 systemd[1]: Starting Cloud-init: Config Stage...
Oct  4 00:28:35 np0005470441 systemd[1]: Starting Notify NFS peers of a restart...
Oct  4 00:28:35 np0005470441 systemd[1]: Starting System Logging Service...
Oct  4 00:28:35 np0005470441 systemd[1]: Starting OpenSSH server daemon...
Oct  4 00:28:35 np0005470441 sm-notify[1004]: Version 2.5.4 starting
Oct  4 00:28:35 np0005470441 systemd[1]: Starting Permit User Sessions...
Oct  4 00:28:35 np0005470441 systemd[1]: Started Notify NFS peers of a restart.
Oct  4 00:28:35 np0005470441 systemd[1]: Started OpenSSH server daemon.
Oct  4 00:28:35 np0005470441 systemd[1]: Finished Permit User Sessions.
Oct  4 00:28:35 np0005470441 systemd[1]: Started Command Scheduler.
Oct  4 00:28:35 np0005470441 systemd[1]: Started Getty on tty1.
Oct  4 00:28:35 np0005470441 systemd[1]: Started Serial Getty on ttyS0.
Oct  4 00:28:35 np0005470441 systemd[1]: Reached target Login Prompts.
Oct  4 00:28:35 np0005470441 systemd[1]: Started System Logging Service.
Oct  4 00:28:35 np0005470441 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Oct  4 00:28:35 np0005470441 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  4 00:28:35 np0005470441 systemd[1]: Reached target Multi-User System.
Oct  4 00:28:35 np0005470441 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  4 00:28:35 np0005470441 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  4 00:28:35 np0005470441 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  4 00:28:35 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 00:28:35 np0005470441 cloud-init[1036]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Sat, 04 Oct 2025 04:28:35 +0000. Up 13.38 seconds.
Oct  4 00:28:35 np0005470441 systemd[1]: Finished Cloud-init: Config Stage.
Oct  4 00:28:35 np0005470441 systemd[1]: Starting Cloud-init: Final Stage...
Oct  4 00:28:36 np0005470441 cloud-init[1040]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Sat, 04 Oct 2025 04:28:36 +0000. Up 13.80 seconds.
Oct  4 00:28:36 np0005470441 cloud-init[1042]: #############################################################
Oct  4 00:28:36 np0005470441 cloud-init[1043]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  4 00:28:36 np0005470441 cloud-init[1045]: 256 SHA256:f+cJiJ/1EEPW2SHopbOkCrmYKSzj4t7RfXSxxqQxVH4 root@np0005470441.novalocal (ECDSA)
Oct  4 00:28:36 np0005470441 cloud-init[1047]: 256 SHA256:L0UuOyVU427nFSWPrnjUcdbi+xhmhDveWcX9q/kNw+0 root@np0005470441.novalocal (ED25519)
Oct  4 00:28:36 np0005470441 cloud-init[1049]: 3072 SHA256:QpnBJ3BFfP5ambZy4T+C1fGA1Pna1Ke9xHUftEUpS48 root@np0005470441.novalocal (RSA)
Oct  4 00:28:36 np0005470441 cloud-init[1050]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  4 00:28:36 np0005470441 cloud-init[1051]: #############################################################
Oct  4 00:28:36 np0005470441 cloud-init[1040]: Cloud-init v. 24.4-7.el9 finished at Sat, 04 Oct 2025 04:28:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 14.02 seconds
Oct  4 00:28:36 np0005470441 systemd[1]: Finished Cloud-init: Final Stage.
Oct  4 00:28:36 np0005470441 systemd[1]: Reached target Cloud-init target.
Oct  4 00:28:36 np0005470441 systemd[1]: Startup finished in 1.680s (kernel) + 2.667s (initrd) + 9.744s (userspace) = 14.092s.
Oct  4 00:28:38 np0005470441 chronyd[798]: Selected source 167.160.187.179 (2.centos.pool.ntp.org)
Oct  4 00:28:38 np0005470441 chronyd[798]: System clock TAI offset set to 37 seconds
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 25 affinity is now unmanaged
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 31 affinity is now unmanaged
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 28 affinity is now unmanaged
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 32 affinity is now unmanaged
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 30 affinity is now unmanaged
Oct  4 00:28:39 np0005470441 irqbalance[786]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  4 00:28:39 np0005470441 irqbalance[786]: IRQ 29 affinity is now unmanaged
Oct  4 00:28:43 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 00:28:52 np0005470441 systemd[1]: Created slice User Slice of UID 1000.
Oct  4 00:28:52 np0005470441 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  4 00:28:52 np0005470441 systemd-logind[796]: New session 1 of user zuul.
Oct  4 00:28:52 np0005470441 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  4 00:28:52 np0005470441 systemd[1]: Starting User Manager for UID 1000...
Oct  4 00:28:52 np0005470441 systemd[1059]: Queued start job for default target Main User Target.
Oct  4 00:28:52 np0005470441 systemd[1059]: Created slice User Application Slice.
Oct  4 00:28:52 np0005470441 systemd[1059]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  4 00:28:52 np0005470441 systemd[1059]: Started Daily Cleanup of User's Temporary Directories.
Oct  4 00:28:52 np0005470441 systemd[1059]: Reached target Paths.
Oct  4 00:28:52 np0005470441 systemd[1059]: Reached target Timers.
Oct  4 00:28:52 np0005470441 systemd[1059]: Starting D-Bus User Message Bus Socket...
Oct  4 00:28:52 np0005470441 systemd[1059]: Starting Create User's Volatile Files and Directories...
Oct  4 00:28:52 np0005470441 systemd[1059]: Listening on D-Bus User Message Bus Socket.
Oct  4 00:28:52 np0005470441 systemd[1059]: Reached target Sockets.
Oct  4 00:28:52 np0005470441 systemd[1059]: Finished Create User's Volatile Files and Directories.
Oct  4 00:28:52 np0005470441 systemd[1059]: Reached target Basic System.
Oct  4 00:28:52 np0005470441 systemd[1059]: Reached target Main User Target.
Oct  4 00:28:52 np0005470441 systemd[1059]: Startup finished in 197ms.
Oct  4 00:28:52 np0005470441 systemd[1]: Started User Manager for UID 1000.
Oct  4 00:28:52 np0005470441 systemd[1]: Started Session 1 of User zuul.
Oct  4 00:28:53 np0005470441 python3[1142]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 00:28:56 np0005470441 python3[1170]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 00:29:00 np0005470441 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  4 00:29:03 np0005470441 python3[1230]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 00:29:04 np0005470441 python3[1270]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  4 00:29:06 np0005470441 python3[1296]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDZV4ve57F/h7ghJUn26dbwyXMlywBropRmBI5ok4uwB+9tQ87OGglL79HXVPmUgxKbPmT6aDvHmE4MxYbc1tboT811PbU0SRTokswuSHt0rXZQ2ZdLCb20vQdVa+jRlyo8Jq3BhtalEiGPxNYi0J6fOkD4PlllQwRwK8yPlgS8t5dQkvdcYYamR2raf4OY+rmNrwPWTHTB9bClvnhrJIa+UyM0AaBcuhlqJjj+E3lsnDiuotgxqzHW15yfXtw9U6pVLMBZhh9CYyMFIWOeSnNfxtCUvj1vzDCWnWeMwZWBnMYEJn9R1OjDXwQcHgYiQLSHlkcUsDHoO9MzldIcOM1n8NDNNHSvVGBHn4ZAUxjTlT2T1tI8+AwPXcYlYzCb33H34a2/WRUZQUPQ/IHcCiI4c9/OnfhcOATwrjc4+ZSWngbzIxEKjyzCFBtuo1PYiglJFD1Bb7SuLfZ19PCGnOn1QrTzni8QbTG4BaReNKzicoR9iGR9wMwDHgmOuS9JaME= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:06 np0005470441 python3[1320]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:07 np0005470441 python3[1419]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:07 np0005470441 python3[1490]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759552147.13475-252-95065538208770/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=1c2dd4ee6b5c461fb460c0a837f0c29e_id_rsa follow=False checksum=33204cfe01c4c22c31a3f45df8d991e87c763812 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:08 np0005470441 python3[1613]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:08 np0005470441 python3[1684]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759552148.1600313-307-254254534215289/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=1c2dd4ee6b5c461fb460c0a837f0c29e_id_rsa.pub follow=False checksum=1f537cb52a38810e38b9b7a66d85733ae15e9304 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:10 np0005470441 python3[1732]: ansible-ping Invoked with data=pong
Oct  4 00:29:11 np0005470441 python3[1756]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 00:29:13 np0005470441 python3[1814]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  4 00:29:14 np0005470441 python3[1846]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:14 np0005470441 python3[1870]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:15 np0005470441 python3[1894]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:15 np0005470441 python3[1918]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:15 np0005470441 python3[1942]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:16 np0005470441 python3[1966]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:17 np0005470441 python3[1992]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:18 np0005470441 python3[2070]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:19 np0005470441 python3[2143]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759552158.144649-32-86773407345370/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:20 np0005470441 python3[2191]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:20 np0005470441 python3[2215]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:20 np0005470441 python3[2239]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:21 np0005470441 python3[2263]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:21 np0005470441 python3[2287]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:21 np0005470441 python3[2311]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:21 np0005470441 python3[2335]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:22 np0005470441 python3[2359]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:22 np0005470441 python3[2383]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:22 np0005470441 python3[2407]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:23 np0005470441 python3[2431]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:23 np0005470441 python3[2455]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:23 np0005470441 python3[2479]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:24 np0005470441 python3[2503]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:24 np0005470441 python3[2527]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:24 np0005470441 python3[2551]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:24 np0005470441 python3[2575]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:25 np0005470441 python3[2599]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:25 np0005470441 python3[2623]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:25 np0005470441 python3[2647]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:26 np0005470441 python3[2671]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:26 np0005470441 python3[2695]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:26 np0005470441 python3[2719]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:26 np0005470441 python3[2743]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:27 np0005470441 python3[2767]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:27 np0005470441 python3[2791]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:29:29 np0005470441 python3[2817]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  4 00:29:29 np0005470441 systemd[1]: Starting Time & Date Service...
Oct  4 00:29:29 np0005470441 systemd[1]: Started Time & Date Service.
Oct  4 00:29:29 np0005470441 systemd-timedated[2819]: Changed time zone to 'UTC' (UTC).
Oct  4 00:29:30 np0005470441 python3[2849]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:30 np0005470441 python3[2925]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:31 np0005470441 python3[2996]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759552170.3972101-252-34894747121268/source _original_basename=tmp5nf62cv9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:31 np0005470441 python3[3096]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:32 np0005470441 python3[3167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759552171.3533814-303-204860786907058/source _original_basename=tmpmlgqe49b follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:32 np0005470441 python3[3269]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:33 np0005470441 python3[3342]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759552172.574848-382-121764835907439/source _original_basename=tmpd71av_2o follow=False checksum=d3787dbc1d919dd7098cc7939d07e9b9a9d1522d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:33 np0005470441 python3[3390]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:29:34 np0005470441 python3[3416]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:29:34 np0005470441 python3[3496]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:29:35 np0005470441 python3[3569]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759552174.3993397-453-4904078393733/source _original_basename=tmpyyqj5uiq follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:35 np0005470441 python3[3620]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-f017-9b65-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:29:36 np0005470441 python3[3648]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-f017-9b65-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  4 00:29:37 np0005470441 python3[3676]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:58 np0005470441 python3[3702]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:29:59 np0005470441 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  4 00:30:59 np0005470441 systemd-logind[796]: Session 1 logged out. Waiting for processes to exit.
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  4 00:31:06 np0005470441 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  4 00:31:06 np0005470441 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.7995] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  4 00:31:06 np0005470441 systemd-udevd[3708]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8205] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8231] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8235] device (eth1): carrier: link connected
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8238] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8244] policy: auto-activating connection 'Wired connection 1' (b7642b6c-6201-3f64-81e9-061049f4b21f)
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8250] device (eth1): Activation: starting connection 'Wired connection 1' (b7642b6c-6201-3f64-81e9-061049f4b21f)
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8251] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8254] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8258] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 00:31:06 np0005470441 NetworkManager[856]: <info>  [1759552266.8263] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:31:06 np0005470441 systemd[1059]: Starting Mark boot as successful...
Oct  4 00:31:06 np0005470441 systemd[1059]: Finished Mark boot as successful.
Oct  4 00:31:07 np0005470441 systemd-logind[796]: New session 3 of user zuul.
Oct  4 00:31:07 np0005470441 systemd[1]: Started Session 3 of User zuul.
Oct  4 00:31:08 np0005470441 python3[3740]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-1557-48cf-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:31:15 np0005470441 python3[3820]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:31:15 np0005470441 python3[3893]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759552274.8027997-155-214375490958087/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=7863861fdf2ab14e8503d1dc80a65b266aa655fb backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:31:15 np0005470441 python3[3943]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 00:31:16 np0005470441 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  4 00:31:16 np0005470441 systemd[1]: Stopped Network Manager Wait Online.
Oct  4 00:31:16 np0005470441 systemd[1]: Stopping Network Manager Wait Online...
Oct  4 00:31:16 np0005470441 systemd[1]: Stopping Network Manager...
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0129] caught SIGTERM, shutting down normally.
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0143] dhcp4 (eth0): canceled DHCP transaction
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0144] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0144] dhcp4 (eth0): state changed no lease
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0148] manager: NetworkManager state is now CONNECTING
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0265] dhcp4 (eth1): canceled DHCP transaction
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0266] dhcp4 (eth1): state changed no lease
Oct  4 00:31:16 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 00:31:16 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 00:31:16 np0005470441 NetworkManager[856]: <info>  [1759552276.0442] exiting (success)
Oct  4 00:31:16 np0005470441 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  4 00:31:16 np0005470441 systemd[1]: Stopped Network Manager.
Oct  4 00:31:16 np0005470441 systemd[1]: NetworkManager.service: Consumed 1.133s CPU time, 9.9M memory peak.
Oct  4 00:31:16 np0005470441 systemd[1]: Starting Network Manager...
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.0988] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6827d816-bf4f-4d80-9923-db74c98231af)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.0989] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.1037] manager[0x55d180cba070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  4 00:31:16 np0005470441 systemd[1]: Starting Hostname Service...
Oct  4 00:31:16 np0005470441 systemd[1]: Started Hostname Service.
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.1986] hostname: hostname: using hostnamed
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.1988] hostname: static hostname changed from (none) to "np0005470441.novalocal"
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.1995] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2001] manager[0x55d180cba070]: rfkill: Wi-Fi hardware radio set enabled
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2001] manager[0x55d180cba070]: rfkill: WWAN hardware radio set enabled
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2037] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2038] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2038] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2039] manager: Networking is enabled by state file
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2042] settings: Loaded settings plugin: keyfile (internal)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2046] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2078] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2088] dhcp: init: Using DHCP client 'internal'
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2091] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2100] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2107] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2119] device (lo): Activation: starting connection 'lo' (b59e5f5b-6646-4ca3-9ea8-9d0febc130b5)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2127] device (eth0): carrier: link connected
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2133] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2140] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2141] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2151] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2159] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2169] device (eth1): carrier: link connected
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2174] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2183] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (b7642b6c-6201-3f64-81e9-061049f4b21f) (indicated)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2185] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2193] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2202] device (eth1): Activation: starting connection 'Wired connection 1' (b7642b6c-6201-3f64-81e9-061049f4b21f)
Oct  4 00:31:16 np0005470441 systemd[1]: Started Network Manager.
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2209] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2217] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2221] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2224] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2229] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2233] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2236] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2240] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2244] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2252] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2256] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2269] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2272] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2288] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2291] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  4 00:31:16 np0005470441 NetworkManager[3960]: <info>  [1759552276.2296] device (lo): Activation: successful, device activated.
Oct  4 00:31:16 np0005470441 systemd[1]: Starting Network Manager Wait Online...
Oct  4 00:31:16 np0005470441 python3[4009]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-1557-48cf-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.5929] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.5936] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.5997] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.6045] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.6047] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.6049] manager: NetworkManager state is now CONNECTED_SITE
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.6052] device (eth0): Activation: successful, device activated.
Oct  4 00:31:17 np0005470441 NetworkManager[3960]: <info>  [1759552277.6057] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  4 00:31:27 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 00:31:46 np0005470441 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.2868] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  4 00:32:01 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 00:32:01 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3199] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3207] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3234] device (eth1): Activation: successful, device activated.
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3249] manager: startup complete
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3261] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <warn>  [1759552321.3272] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3286] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 systemd[1]: Finished Network Manager Wait Online.
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3416] dhcp4 (eth1): canceled DHCP transaction
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3417] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3417] dhcp4 (eth1): state changed no lease
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3443] policy: auto-activating connection 'ci-private-network' (988752d1-f6dd-5210-8cc7-badebbd9e56f)
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3451] device (eth1): Activation: starting connection 'ci-private-network' (988752d1-f6dd-5210-8cc7-badebbd9e56f)
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3452] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3457] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3469] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.3485] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.4114] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.4118] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 00:32:01 np0005470441 NetworkManager[3960]: <info>  [1759552321.4128] device (eth1): Activation: successful, device activated.
Oct  4 00:32:11 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 00:32:16 np0005470441 systemd[1]: session-3.scope: Deactivated successfully.
Oct  4 00:32:16 np0005470441 systemd[1]: session-3.scope: Consumed 1.560s CPU time.
Oct  4 00:32:16 np0005470441 systemd-logind[796]: Session 3 logged out. Waiting for processes to exit.
Oct  4 00:32:16 np0005470441 systemd-logind[796]: Removed session 3.
Oct  4 00:32:53 np0005470441 systemd-logind[796]: New session 4 of user zuul.
Oct  4 00:32:53 np0005470441 systemd[1]: Started Session 4 of User zuul.
Oct  4 00:32:53 np0005470441 python3[4137]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:32:54 np0005470441 python3[4210]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759552373.2829797-365-16812809318373/source _original_basename=tmprmv1dit9 follow=False checksum=5b438df6df709f597d5681c365c5a12775c0f9e7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:32:57 np0005470441 systemd[1]: session-4.scope: Deactivated successfully.
Oct  4 00:32:57 np0005470441 systemd-logind[796]: Session 4 logged out. Waiting for processes to exit.
Oct  4 00:32:57 np0005470441 systemd-logind[796]: Removed session 4.
Oct  4 00:34:51 np0005470441 systemd[1059]: Created slice User Background Tasks Slice.
Oct  4 00:34:51 np0005470441 systemd[1059]: Starting Cleanup of User's Temporary Files and Directories...
Oct  4 00:34:51 np0005470441 systemd[1059]: Finished Cleanup of User's Temporary Files and Directories.
Oct  4 00:40:20 np0005470441 systemd-logind[796]: New session 5 of user zuul.
Oct  4 00:40:20 np0005470441 systemd[1]: Started Session 5 of User zuul.
Oct  4 00:40:20 np0005470441 python3[4270]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d951-a627-000000001cfc-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:20 np0005470441 python3[4299]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:40:21 np0005470441 python3[4325]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:40:21 np0005470441 python3[4351]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:40:21 np0005470441 python3[4377]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:40:21 np0005470441 python3[4403]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:40:21 np0005470441 python3[4403]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  4 00:40:22 np0005470441 python3[4429]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 00:40:22 np0005470441 systemd[1]: Reloading.
Oct  4 00:40:23 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 00:40:24 np0005470441 python3[4485]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  4 00:40:24 np0005470441 python3[4511]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:25 np0005470441 python3[4539]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:25 np0005470441 python3[4567]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:25 np0005470441 python3[4595]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:26 np0005470441 python3[4622]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-d951-a627-000000001d02-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:40:27 np0005470441 python3[4652]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 00:40:30 np0005470441 systemd[1]: session-5.scope: Deactivated successfully.
Oct  4 00:40:30 np0005470441 systemd[1]: session-5.scope: Consumed 3.476s CPU time.
Oct  4 00:40:30 np0005470441 systemd-logind[796]: Session 5 logged out. Waiting for processes to exit.
Oct  4 00:40:30 np0005470441 systemd-logind[796]: Removed session 5.
Oct  4 00:40:31 np0005470441 systemd-logind[796]: New session 6 of user zuul.
Oct  4 00:40:31 np0005470441 systemd[1]: Started Session 6 of User zuul.
Oct  4 00:40:32 np0005470441 python3[4689]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  4 00:41:01 np0005470441 kernel: SELinux:  Converting 363 SID table entries...
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 00:41:01 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  Converting 363 SID table entries...
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 00:41:12 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  Converting 363 SID table entries...
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 00:41:24 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 00:41:26 np0005470441 setsebool[4757]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  4 00:41:26 np0005470441 setsebool[4757]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  4 00:41:39 np0005470441 kernel: SELinux:  Converting 366 SID table entries...
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 00:41:39 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 00:42:00 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  4 00:42:00 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 00:42:00 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 00:42:00 np0005470441 systemd[1]: Reloading.
Oct  4 00:42:00 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 00:42:00 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 00:42:06 np0005470441 systemd[1]: Starting PackageKit Daemon...
Oct  4 00:42:06 np0005470441 systemd[1]: Starting Authorization Manager...
Oct  4 00:42:06 np0005470441 polkitd[7712]: Started polkitd version 0.117
Oct  4 00:42:06 np0005470441 systemd[1]: Started Authorization Manager.
Oct  4 00:42:06 np0005470441 systemd[1]: Started PackageKit Daemon.
Oct  4 00:42:39 np0005470441 python3[17629]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-b73c-6dea-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:42:40 np0005470441 kernel: evm: overlay not supported
Oct  4 00:42:41 np0005470441 systemd[1059]: Starting D-Bus User Message Bus...
Oct  4 00:42:41 np0005470441 dbus-broker-launch[18057]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  4 00:42:41 np0005470441 dbus-broker-launch[18057]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  4 00:42:41 np0005470441 systemd[1059]: Started D-Bus User Message Bus.
Oct  4 00:42:41 np0005470441 dbus-broker-lau[18057]: Ready
Oct  4 00:42:41 np0005470441 systemd[1059]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  4 00:42:41 np0005470441 systemd[1059]: Created slice Slice /user.
Oct  4 00:42:41 np0005470441 systemd[1059]: podman-17967.scope: unit configures an IP firewall, but not running as root.
Oct  4 00:42:41 np0005470441 systemd[1059]: (This warning is only shown for the first unit using IP firewalling.)
Oct  4 00:42:41 np0005470441 systemd[1059]: Started podman-17967.scope.
Oct  4 00:42:41 np0005470441 systemd[1059]: Started podman-pause-9e8d75db.scope.
Oct  4 00:42:43 np0005470441 python3[18667]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.120:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.120:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:42:43 np0005470441 systemd[1]: session-6.scope: Deactivated successfully.
Oct  4 00:42:43 np0005470441 systemd[1]: session-6.scope: Consumed 1min 5.217s CPU time.
Oct  4 00:42:43 np0005470441 systemd-logind[796]: Session 6 logged out. Waiting for processes to exit.
Oct  4 00:42:43 np0005470441 systemd-logind[796]: Removed session 6.
Oct  4 00:43:09 np0005470441 irqbalance[786]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  4 00:43:09 np0005470441 irqbalance[786]: IRQ 27 affinity is now unmanaged
Oct  4 00:43:13 np0005470441 systemd-logind[796]: New session 7 of user zuul.
Oct  4 00:43:13 np0005470441 systemd[1]: Started Session 7 of User zuul.
Oct  4 00:43:14 np0005470441 python3[25483]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN7x6sQf+gSgTXFheY+WFMl+6Cyxh1obvRqS63UX/Tx0OL/a4dcWUzLvwKBfffmTv02VeiskLYo9d3VXbxCAZgM= zuul@np0005470439.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:43:14 np0005470441 python3[25565]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN7x6sQf+gSgTXFheY+WFMl+6Cyxh1obvRqS63UX/Tx0OL/a4dcWUzLvwKBfffmTv02VeiskLYo9d3VXbxCAZgM= zuul@np0005470439.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:43:15 np0005470441 python3[25752]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005470441.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  4 00:43:18 np0005470441 python3[26029]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBN7x6sQf+gSgTXFheY+WFMl+6Cyxh1obvRqS63UX/Tx0OL/a4dcWUzLvwKBfffmTv02VeiskLYo9d3VXbxCAZgM= zuul@np0005470439.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  4 00:43:18 np0005470441 python3[26176]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:43:19 np0005470441 python3[26305]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759552998.6585636-168-265264420477653/source _original_basename=tmpj06vlm52 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:43:20 np0005470441 python3[26398]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Oct  4 00:43:20 np0005470441 systemd[1]: Starting Hostname Service...
Oct  4 00:43:20 np0005470441 systemd[1]: Started Hostname Service.
Oct  4 00:43:20 np0005470441 systemd-hostnamed[26402]: Changed pretty hostname to 'compute-1'
Oct  4 00:43:20 np0005470441 systemd-hostnamed[26402]: Hostname set to <compute-1> (static)
Oct  4 00:43:20 np0005470441 NetworkManager[3960]: <info>  [1759553000.5898] hostname: static hostname changed from "np0005470441.novalocal" to "compute-1"
Oct  4 00:43:20 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 00:43:20 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 00:43:21 np0005470441 systemd[1]: session-7.scope: Deactivated successfully.
Oct  4 00:43:21 np0005470441 systemd[1]: session-7.scope: Consumed 2.510s CPU time.
Oct  4 00:43:21 np0005470441 systemd-logind[796]: Session 7 logged out. Waiting for processes to exit.
Oct  4 00:43:21 np0005470441 systemd-logind[796]: Removed session 7.
Oct  4 00:43:22 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 00:43:22 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 00:43:22 np0005470441 systemd[1]: man-db-cache-update.service: Consumed 1min 16.394s CPU time.
Oct  4 00:43:22 np0005470441 systemd[1]: run-r0d80641a2d684347babba68d90ef81a2.service: Deactivated successfully.
Oct  4 00:43:30 np0005470441 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  4 00:43:30 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 00:43:30 np0005470441 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  4 00:43:30 np0005470441 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  4 00:43:30 np0005470441 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  4 00:43:50 np0005470441 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  4 00:47:12 np0005470441 systemd[1]: packagekit.service: Deactivated successfully.
Oct  4 00:48:05 np0005470441 systemd-logind[796]: New session 8 of user zuul.
Oct  4 00:48:06 np0005470441 systemd[1]: Started Session 8 of User zuul.
Oct  4 00:48:06 np0005470441 python3[26664]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 00:48:08 np0005470441 python3[26780]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:09 np0005470441 python3[26853]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=delorean.repo follow=False checksum=4e4cce9745a4bbb6ec620562e5c0dcf170d2dc8d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:09 np0005470441 python3[26879]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:09 np0005470441 python3[26952]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:10 np0005470441 python3[26978]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:10 np0005470441 python3[27051]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:10 np0005470441 python3[27077]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:11 np0005470441 python3[27150]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:11 np0005470441 python3[27176]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:11 np0005470441 python3[27249]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:12 np0005470441 python3[27275]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:12 np0005470441 python3[27348]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:12 np0005470441 python3[27374]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  4 00:48:12 np0005470441 python3[27447]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759553288.1488254-30618-19261682800981/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=a914794b06504b638afc74af4fed2b143e3dbab3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 00:48:25 np0005470441 python3[27495]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 00:53:25 np0005470441 systemd-logind[796]: Session 8 logged out. Waiting for processes to exit.
Oct  4 00:53:25 np0005470441 systemd[1]: session-8.scope: Deactivated successfully.
Oct  4 00:53:25 np0005470441 systemd[1]: session-8.scope: Consumed 5.094s CPU time.
Oct  4 00:53:25 np0005470441 systemd-logind[796]: Removed session 8.
Oct  4 01:02:29 np0005470441 systemd-logind[796]: New session 9 of user zuul.
Oct  4 01:02:29 np0005470441 systemd[1]: Started Session 9 of User zuul.
Oct  4 01:02:30 np0005470441 python3.9[27676]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:02:31 np0005470441 python3.9[27857]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:02:42 np0005470441 systemd[1]: session-9.scope: Deactivated successfully.
Oct  4 01:02:42 np0005470441 systemd[1]: session-9.scope: Consumed 7.983s CPU time.
Oct  4 01:02:42 np0005470441 systemd-logind[796]: Session 9 logged out. Waiting for processes to exit.
Oct  4 01:02:42 np0005470441 systemd-logind[796]: Removed session 9.
Oct  4 01:02:48 np0005470441 systemd-logind[796]: New session 10 of user zuul.
Oct  4 01:02:48 np0005470441 systemd[1]: Started Session 10 of User zuul.
Oct  4 01:02:49 np0005470441 python3.9[28070]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:02:50 np0005470441 systemd[1]: session-10.scope: Deactivated successfully.
Oct  4 01:02:50 np0005470441 systemd-logind[796]: Session 10 logged out. Waiting for processes to exit.
Oct  4 01:02:50 np0005470441 systemd-logind[796]: Removed session 10.
Oct  4 01:03:06 np0005470441 systemd-logind[796]: New session 11 of user zuul.
Oct  4 01:03:06 np0005470441 systemd[1]: Started Session 11 of User zuul.
Oct  4 01:03:06 np0005470441 python3.9[28253]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  4 01:03:08 np0005470441 python3.9[28427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:03:08 np0005470441 python3.9[28579]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:03:10 np0005470441 python3.9[28732]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:03:10 np0005470441 python3.9[28884]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:03:11 np0005470441 python3.9[29036]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:03:12 np0005470441 python3.9[29159]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554191.1151304-178-75290987542373/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:03:13 np0005470441 python3.9[29311]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:03:13 np0005470441 python3.9[29467]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:03:14 np0005470441 python3.9[29617]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:03:19 np0005470441 irqbalance[786]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  4 01:03:19 np0005470441 irqbalance[786]: IRQ 26 affinity is now unmanaged
Oct  4 01:03:20 np0005470441 python3.9[29872]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:03:20 np0005470441 python3.9[30022]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:03:22 np0005470441 python3.9[30176]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:03:23 np0005470441 python3.9[30334]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:03:24 np0005470441 python3.9[30418]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:04:11 np0005470441 systemd[1]: Reloading.
Oct  4 01:04:11 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:04:11 np0005470441 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  4 01:04:12 np0005470441 systemd[1]: Reloading.
Oct  4 01:04:12 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:04:12 np0005470441 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  4 01:04:12 np0005470441 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  4 01:04:12 np0005470441 systemd[1]: Reloading.
Oct  4 01:04:12 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:04:13 np0005470441 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  4 01:04:13 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:04:13 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:04:13 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:05:32 np0005470441 kernel: SELinux:  Converting 2713 SID table entries...
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:05:32 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:05:32 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  4 01:05:32 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:05:32 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:05:32 np0005470441 systemd[1]: Reloading.
Oct  4 01:05:32 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:05:33 np0005470441 systemd[1]: Starting dnf makecache...
Oct  4 01:05:33 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:05:33 np0005470441 dnf[31021]: Failed determining last makecache time.
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-barbican-42b4c41831408a8e323 114 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 173 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 systemd[1]: Starting PackageKit Daemon...
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-cinder-1c00d6490d88e436f26ef 157 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-stevedore-c4acc5639fd2329372142 178 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 systemd[1]: Started PackageKit Daemon.
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-cloudkitty-tests-tempest-3961dc 186 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-diskimage-builder-43381184423c185801b5 179 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 177 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-designate-tests-tempest-347fdbc 142 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-glance-1fd12c29b339f30fe823e 185 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 174 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-manila-3c01b7181572c95dac462 184 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-whitebox-neutron-tests-tempest- 181 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-octavia-ba397f07a7331190208c 174 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-watcher-c014f81a8647287f6dcc 212 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-edpm-image-builder-55ba53cf215b14ed95b 176 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 195 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-swift-dc98a8463506ac520c469a 180 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-python-tempestconf-8515371b7cceebd4282 186 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: delorean-openstack-heat-ui-013accbfd179753bc3f0 197 kB/s | 3.0 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: CentOS Stream 9 - BaseOS                         68 kB/s | 6.7 kB     00:00
Oct  4 01:05:33 np0005470441 dnf[31021]: CentOS Stream 9 - AppStream                      73 kB/s | 6.8 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: CentOS Stream 9 - CRB                            60 kB/s | 6.6 kB     00:00
Oct  4 01:05:34 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:05:34 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:05:34 np0005470441 systemd[1]: man-db-cache-update.service: Consumed 1.208s CPU time.
Oct  4 01:05:34 np0005470441 systemd[1]: run-r087e6f3921b243dd974bf15434f07428.service: Deactivated successfully.
Oct  4 01:05:34 np0005470441 dnf[31021]: CentOS Stream 9 - Extras packages                32 kB/s | 8.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: dlrn-antelope-testing                           150 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: dlrn-antelope-build-deps                        135 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: centos9-rabbitmq                                106 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: centos9-storage                                 110 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: centos9-opstools                                124 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: NFV SIG OpenvSwitch                             113 kB/s | 3.0 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: repo-setup-centos-appstream                     163 kB/s | 4.4 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: repo-setup-centos-baseos                        157 kB/s | 3.9 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: repo-setup-centos-highavailability              158 kB/s | 3.9 kB     00:00
Oct  4 01:05:34 np0005470441 dnf[31021]: repo-setup-centos-powertools                    125 kB/s | 4.3 kB     00:00
Oct  4 01:05:35 np0005470441 dnf[31021]: Extra Packages for Enterprise Linux 9 - x86_64  174 kB/s |  23 kB     00:00
Oct  4 01:05:35 np0005470441 dnf[31021]: Metadata cache created.
Oct  4 01:05:35 np0005470441 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  4 01:05:35 np0005470441 systemd[1]: Finished dnf makecache.
Oct  4 01:05:35 np0005470441 systemd[1]: dnf-makecache.service: Consumed 1.790s CPU time.
Oct  4 01:05:38 np0005470441 python3.9[31976]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:05:41 np0005470441 python3.9[32257]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  4 01:05:42 np0005470441 python3.9[32409]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  4 01:05:45 np0005470441 python3.9[32563]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:05:47 np0005470441 python3.9[32715]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  4 01:05:48 np0005470441 python3.9[32867]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:05:53 np0005470441 python3.9[33019]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:05:54 np0005470441 python3.9[33142]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554349.468109-640-126381057416197/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:05:55 np0005470441 python3.9[33294]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  4 01:05:56 np0005470441 python3.9[33447]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:05:56 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:05:57 np0005470441 python3.9[33606]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  4 01:05:58 np0005470441 python3.9[33766]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  4 01:05:59 np0005470441 python3.9[33919]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:06:00 np0005470441 python3.9[34077]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  4 01:06:01 np0005470441 python3.9[34229]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:06:03 np0005470441 python3.9[34382]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:04 np0005470441 python3.9[34534]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:06:05 np0005470441 python3.9[34657]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554364.1053739-926-199171150435845/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:06 np0005470441 python3.9[34809]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:06:06 np0005470441 systemd[1]: Starting Load Kernel Modules...
Oct  4 01:06:06 np0005470441 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  4 01:06:06 np0005470441 kernel: Bridge firewalling registered
Oct  4 01:06:06 np0005470441 systemd-modules-load[34813]: Inserted module 'br_netfilter'
Oct  4 01:06:06 np0005470441 systemd[1]: Finished Load Kernel Modules.
Oct  4 01:06:07 np0005470441 python3.9[34969]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:06:07 np0005470441 python3.9[35092]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554366.8073359-994-10639014398040/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:08 np0005470441 python3.9[35244]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:06:12 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:06:12 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:06:12 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:06:12 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:06:12 np0005470441 systemd[1]: Reloading.
Oct  4 01:06:12 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:06:12 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:06:15 np0005470441 python3.9[37205]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:06:16 np0005470441 python3.9[38316]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  4 01:06:16 np0005470441 python3.9[39104]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:06:17 np0005470441 python3.9[39442]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:18 np0005470441 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  4 01:06:18 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:06:18 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:06:18 np0005470441 systemd[1]: man-db-cache-update.service: Consumed 5.470s CPU time.
Oct  4 01:06:18 np0005470441 systemd[1]: run-r5f90881a73584ceea5dbf95e732f50ba.service: Deactivated successfully.
Oct  4 01:06:18 np0005470441 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  4 01:06:19 np0005470441 python3.9[39816]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:06:19 np0005470441 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  4 01:06:19 np0005470441 systemd[1]: tuned.service: Deactivated successfully.
Oct  4 01:06:19 np0005470441 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  4 01:06:19 np0005470441 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  4 01:06:19 np0005470441 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  4 01:06:20 np0005470441 python3.9[39977]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  4 01:06:24 np0005470441 python3.9[40129]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:06:24 np0005470441 systemd[1]: Reloading.
Oct  4 01:06:24 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:06:25 np0005470441 python3.9[40318]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:06:25 np0005470441 systemd[1]: Reloading.
Oct  4 01:06:25 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:06:26 np0005470441 python3.9[40506]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:27 np0005470441 python3.9[40659]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:27 np0005470441 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  4 01:06:28 np0005470441 python3.9[40812]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:30 np0005470441 python3.9[40974]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:31 np0005470441 python3.9[41127]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:06:31 np0005470441 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  4 01:06:31 np0005470441 systemd[1]: Stopped Apply Kernel Variables.
Oct  4 01:06:31 np0005470441 systemd[1]: Stopping Apply Kernel Variables...
Oct  4 01:06:31 np0005470441 systemd[1]: Starting Apply Kernel Variables...
Oct  4 01:06:31 np0005470441 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  4 01:06:31 np0005470441 systemd[1]: Finished Apply Kernel Variables.
Oct  4 01:06:31 np0005470441 systemd[1]: session-11.scope: Deactivated successfully.
Oct  4 01:06:31 np0005470441 systemd[1]: session-11.scope: Consumed 2min 18.073s CPU time.
Oct  4 01:06:31 np0005470441 systemd-logind[796]: Session 11 logged out. Waiting for processes to exit.
Oct  4 01:06:31 np0005470441 systemd-logind[796]: Removed session 11.
Oct  4 01:06:36 np0005470441 systemd-logind[796]: New session 12 of user zuul.
Oct  4 01:06:36 np0005470441 systemd[1]: Started Session 12 of User zuul.
Oct  4 01:06:37 np0005470441 python3.9[41310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:06:39 np0005470441 python3.9[41464]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:06:40 np0005470441 python3.9[41620]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:42 np0005470441 python3.9[41771]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:06:43 np0005470441 python3.9[41927]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:06:44 np0005470441 python3.9[42011]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:06:46 np0005470441 python3.9[42164]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:06:47 np0005470441 python3.9[42335]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:06:48 np0005470441 python3.9[42487]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:06:48 np0005470441 podman[42488]: 2025-10-04 05:06:48.32284876 +0000 UTC m=+0.105715543 system refresh
Oct  4 01:06:49 np0005470441 python3.9[42651]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:06:49 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:06:49 np0005470441 python3.9[42774]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554408.5707655-288-234074107326331/.source.json follow=False _original_basename=podman_network_config.j2 checksum=f5916c4900045d715d7f2a5f05455ef4e3134d87 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:06:50 np0005470441 python3.9[42926]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:06:51 np0005470441 python3.9[43049]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554410.2060997-333-271093374214816/.source.conf follow=False _original_basename=registries.conf.j2 checksum=0a61421d1c4ad0a0032923f9c8c29ac0d59798cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:52 np0005470441 python3.9[43201]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:52 np0005470441 python3.9[43353]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:53 np0005470441 python3.9[43505]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:54 np0005470441 python3.9[43657]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:06:55 np0005470441 python3.9[43807]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:06:56 np0005470441 python3.9[43961]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:06:57 np0005470441 python3.9[44114]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:00 np0005470441 python3.9[44274]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:02 np0005470441 python3.9[44427]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:05 np0005470441 python3.9[44580]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:07 np0005470441 python3.9[44736]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:11 np0005470441 python3.9[44904]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:13 np0005470441 python3.9[45057]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:07:26 np0005470441 python3.9[45394]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:07:27 np0005470441 python3.9[45569]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:07:27 np0005470441 python3.9[45692]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759554446.6564336-750-77737455141549/.source.json _original_basename=.y0d8ordu follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:07:28 np0005470441 python3.9[45844]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:07:28 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:31 np0005470441 systemd[1]: var-lib-containers-storage-overlay-compat3195976320-lower\x2dmapped.mount: Deactivated successfully.
Oct  4 01:07:35 np0005470441 podman[45856]: 2025-10-04 05:07:35.492973675 +0000 UTC m=+6.528735147 image pull 13ffa098770f5095913e3dfecd601fec25536aa84ab5f90403cd9d7e0dc55d92 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  4 01:07:35 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:35 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:35 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:37 np0005470441 python3.9[46152]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:07:37 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:39 np0005470441 podman[46163]: 2025-10-04 05:07:39.260136453 +0000 UTC m=+2.151734664 image pull 3df028879be2d3446cef1cbbc8cb13789865aba0f4436e902e1d2605836cf14d quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  4 01:07:39 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:39 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:39 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:40 np0005470441 python3.9[46422]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:07:40 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:54 np0005470441 podman[46434]: 2025-10-04 05:07:54.776418469 +0000 UTC m=+14.231926425 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:07:54 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:54 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:54 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:55 np0005470441 python3.9[46738]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:07:55 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:57 np0005470441 podman[46751]: 2025-10-04 05:07:57.199992449 +0000 UTC m=+1.302170042 image pull 5652055d294fa12a03c8287ea23106a5617d67d9b2c36e3419473120055c6b9a quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  4 01:07:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:07:58 np0005470441 python3.9[46985]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:07:58 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:09 np0005470441 podman[46997]: 2025-10-04 05:08:09.075150767 +0000 UTC m=+10.625975679 image pull 2c237c6794a3227fe6b226ac969a4d71d1a5c1686381c5edb016d0bc4442832a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  4 01:08:09 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:09 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:09 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:11 np0005470441 python3.9[47256]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:08:11 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:18 np0005470441 podman[47268]: 2025-10-04 05:08:18.826258419 +0000 UTC m=+7.737958169 image pull 50efaea89142e519f31f4c7eaa86ed42b916b0efad6daebac6b52c254ced116c quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  4 01:08:18 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:18 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:18 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:19 np0005470441 python3.9[47523]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  4 01:08:21 np0005470441 podman[47534]: 2025-10-04 05:08:21.330000851 +0000 UTC m=+1.626026667 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  4 01:08:21 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:21 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:21 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:08:22 np0005470441 systemd[1]: session-12.scope: Deactivated successfully.
Oct  4 01:08:22 np0005470441 systemd[1]: session-12.scope: Consumed 1min 49.063s CPU time.
Oct  4 01:08:22 np0005470441 systemd-logind[796]: Session 12 logged out. Waiting for processes to exit.
Oct  4 01:08:22 np0005470441 systemd-logind[796]: Removed session 12.
Oct  4 01:08:28 np0005470441 systemd-logind[796]: New session 13 of user zuul.
Oct  4 01:08:28 np0005470441 systemd[1]: Started Session 13 of User zuul.
Oct  4 01:08:29 np0005470441 python3.9[47834]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:08:30 np0005470441 python3.9[47990]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  4 01:08:31 np0005470441 python3.9[48143]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:08:32 np0005470441 python3.9[48301]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  4 01:08:34 np0005470441 python3.9[48461]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:08:34 np0005470441 python3.9[48545]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:08:37 np0005470441 python3.9[48706]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:08:50 np0005470441 kernel: SELinux:  Converting 2725 SID table entries...
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:08:50 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:08:50 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  4 01:08:50 np0005470441 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  4 01:08:51 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:08:51 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:08:52 np0005470441 systemd[1]: Reloading.
Oct  4 01:08:52 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:08:52 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:08:52 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:08:52 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:08:52 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:08:52 np0005470441 systemd[1]: run-r2d67a01919b94b8bb1f197d7e679963b.service: Deactivated successfully.
Oct  4 01:08:54 np0005470441 python3.9[49816]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:08:54 np0005470441 systemd[1]: Reloading.
Oct  4 01:08:54 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:08:54 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:08:54 np0005470441 systemd[1]: Starting Open vSwitch Database Unit...
Oct  4 01:08:54 np0005470441 chown[49857]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  4 01:08:54 np0005470441 ovs-ctl[49862]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  4 01:08:54 np0005470441 ovs-ctl[49862]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  4 01:08:54 np0005470441 ovs-ctl[49862]: Starting ovsdb-server [  OK  ]
Oct  4 01:08:54 np0005470441 ovs-vsctl[49911]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  4 01:08:54 np0005470441 ovs-vsctl[49931]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"9c4f1832-26e9-4f83-989c-c9b104eab4b1\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  4 01:08:54 np0005470441 ovs-ctl[49862]: Configuring Open vSwitch system IDs [  OK  ]
Oct  4 01:08:54 np0005470441 ovs-ctl[49862]: Enabling remote OVSDB managers [  OK  ]
Oct  4 01:08:54 np0005470441 systemd[1]: Started Open vSwitch Database Unit.
Oct  4 01:08:54 np0005470441 ovs-vsctl[49937]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  4 01:08:54 np0005470441 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  4 01:08:54 np0005470441 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  4 01:08:54 np0005470441 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  4 01:08:55 np0005470441 kernel: openvswitch: Open vSwitch switching datapath
Oct  4 01:08:55 np0005470441 ovs-ctl[49981]: Inserting openvswitch module [  OK  ]
Oct  4 01:08:55 np0005470441 ovs-ctl[49950]: Starting ovs-vswitchd [  OK  ]
Oct  4 01:08:55 np0005470441 ovs-vsctl[49999]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Oct  4 01:08:55 np0005470441 ovs-ctl[49950]: Enabling remote OVSDB managers [  OK  ]
Oct  4 01:08:55 np0005470441 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  4 01:08:55 np0005470441 systemd[1]: Starting Open vSwitch...
Oct  4 01:08:55 np0005470441 systemd[1]: Finished Open vSwitch.
Oct  4 01:08:56 np0005470441 python3.9[50151]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:08:57 np0005470441 python3.9[50303]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  4 01:08:58 np0005470441 kernel: SELinux:  Converting 2739 SID table entries...
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:08:58 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:08:59 np0005470441 python3.9[50458]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:09:00 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  4 01:09:00 np0005470441 python3.9[50616]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:09:02 np0005470441 python3.9[50769]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:09:04 np0005470441 python3.9[51056]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  4 01:09:04 np0005470441 python3.9[51206]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:09:05 np0005470441 python3.9[51360]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:09:07 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:09:07 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:09:07 np0005470441 systemd[1]: Reloading.
Oct  4 01:09:07 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:09:07 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:09:07 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:09:07 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:09:07 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:09:07 np0005470441 systemd[1]: run-r7a076906247f4731a578af6a1cb92695.service: Deactivated successfully.
Oct  4 01:09:08 np0005470441 python3.9[51677]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:09:08 np0005470441 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  4 01:09:08 np0005470441 systemd[1]: Stopped Network Manager Wait Online.
Oct  4 01:09:08 np0005470441 systemd[1]: Stopping Network Manager Wait Online...
Oct  4 01:09:08 np0005470441 systemd[1]: Stopping Network Manager...
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9116] caught SIGTERM, shutting down normally.
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9130] dhcp4 (eth0): canceled DHCP transaction
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9130] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9130] dhcp4 (eth0): state changed no lease
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9132] manager: NetworkManager state is now CONNECTED_SITE
Oct  4 01:09:08 np0005470441 NetworkManager[3960]: <info>  [1759554548.9193] exiting (success)
Oct  4 01:09:08 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 01:09:08 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 01:09:08 np0005470441 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  4 01:09:08 np0005470441 systemd[1]: Stopped Network Manager.
Oct  4 01:09:08 np0005470441 systemd[1]: NetworkManager.service: Consumed 14.529s CPU time, 4.0M memory peak, read 0B from disk, written 20.0K to disk.
Oct  4 01:09:08 np0005470441 systemd[1]: Starting Network Manager...
Oct  4 01:09:08 np0005470441 NetworkManager[51690]: <info>  [1759554548.9768] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6827d816-bf4f-4d80-9923-db74c98231af)
Oct  4 01:09:08 np0005470441 NetworkManager[51690]: <info>  [1759554548.9771] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  4 01:09:08 np0005470441 NetworkManager[51690]: <info>  [1759554548.9824] manager[0x55ab820b6090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  4 01:09:09 np0005470441 systemd[1]: Starting Hostname Service...
Oct  4 01:09:09 np0005470441 systemd[1]: Started Hostname Service.
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0647] hostname: hostname: using hostnamed
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0648] hostname: static hostname changed from (none) to "compute-1"
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0653] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0659] manager[0x55ab820b6090]: rfkill: Wi-Fi hardware radio set enabled
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0659] manager[0x55ab820b6090]: rfkill: WWAN hardware radio set enabled
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0689] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0701] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0701] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0702] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0703] manager: Networking is enabled by state file
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0706] settings: Loaded settings plugin: keyfile (internal)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0710] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0736] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0746] dhcp: init: Using DHCP client 'internal'
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0749] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0754] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0759] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0767] device (lo): Activation: starting connection 'lo' (b59e5f5b-6646-4ca3-9ea8-9d0febc130b5)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0774] device (eth0): carrier: link connected
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0778] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0783] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0783] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0789] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0795] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0800] device (eth1): carrier: link connected
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0804] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0808] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (988752d1-f6dd-5210-8cc7-badebbd9e56f) (indicated)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0808] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0812] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0817] device (eth1): Activation: starting connection 'ci-private-network' (988752d1-f6dd-5210-8cc7-badebbd9e56f)
Oct  4 01:09:09 np0005470441 systemd[1]: Started Network Manager.
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0826] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0832] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0834] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0836] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0837] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0839] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0840] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0841] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0844] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0849] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0851] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0871] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0882] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0908] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0910] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0915] device (lo): Activation: successful, device activated.
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0919] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0925] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  4 01:09:09 np0005470441 systemd[1]: Starting Network Manager Wait Online...
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.0997] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1007] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1009] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1011] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1016] device (eth1): Activation: successful, device activated.
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1027] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1030] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1034] manager: NetworkManager state is now CONNECTED_SITE
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1039] device (eth0): Activation: successful, device activated.
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1044] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  4 01:09:09 np0005470441 NetworkManager[51690]: <info>  [1759554549.1047] manager: startup complete
Oct  4 01:09:09 np0005470441 systemd[1]: Finished Network Manager Wait Online.
Oct  4 01:09:10 np0005470441 python3.9[51903]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:09:15 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:09:15 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:09:15 np0005470441 systemd[1]: Reloading.
Oct  4 01:09:15 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:09:15 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:09:15 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:09:16 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:09:16 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:09:16 np0005470441 systemd[1]: run-r1febf91324af4f4ab4e88fa5f7538d4b.service: Deactivated successfully.
Oct  4 01:09:17 np0005470441 python3.9[52367]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:09:18 np0005470441 python3.9[52519]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:19 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 01:09:19 np0005470441 python3.9[52673]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:19 np0005470441 python3.9[52825]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:20 np0005470441 python3.9[52977]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:21 np0005470441 python3.9[53129]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:22 np0005470441 python3.9[53281]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:09:22 np0005470441 python3.9[53404]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554561.7060356-648-3857834264666/.source _original_basename=.p76v7w1l follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:23 np0005470441 python3.9[53556]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:24 np0005470441 python3.9[53708]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  4 01:09:25 np0005470441 python3.9[53860]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:27 np0005470441 python3.9[54287]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  4 01:09:28 np0005470441 ansible-async_wrapper.py[54462]: Invoked with j196204716394 300 /home/zuul/.ansible/tmp/ansible-tmp-1759554567.8365293-846-171152250750697/AnsiballZ_edpm_os_net_config.py _
Oct  4 01:09:28 np0005470441 ansible-async_wrapper.py[54465]: Starting module and watcher
Oct  4 01:09:28 np0005470441 ansible-async_wrapper.py[54465]: Start watching 54466 (300)
Oct  4 01:09:28 np0005470441 ansible-async_wrapper.py[54466]: Start module (54466)
Oct  4 01:09:28 np0005470441 ansible-async_wrapper.py[54462]: Return async_wrapper task started.
Oct  4 01:09:28 np0005470441 python3.9[54467]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  4 01:09:29 np0005470441 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  4 01:09:29 np0005470441 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  4 01:09:29 np0005470441 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  4 01:09:29 np0005470441 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  4 01:09:29 np0005470441 kernel: cfg80211: failed to load regulatory.db
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.5577] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.5601] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6165] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6167] audit: op="connection-add" uuid="2a4d4766-286c-4da7-9e7e-4556a65bb05d" name="br-ex-br" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6180] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6181] audit: op="connection-add" uuid="8b2d714f-ef32-4e45-91a9-5219224e8bff" name="br-ex-port" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6192] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6192] audit: op="connection-add" uuid="15694283-e80b-41d3-8d52-c7fa888e4f7e" name="eth1-port" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6204] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6205] audit: op="connection-add" uuid="bcafe861-a516-4b7a-8e7c-5028d7f0ee4d" name="vlan20-port" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6215] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6216] audit: op="connection-add" uuid="64867eb3-88de-4532-955c-703aad1f4ad0" name="vlan21-port" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6226] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6227] audit: op="connection-add" uuid="355a2d07-3f41-488b-a174-27c20a2dae6f" name="vlan22-port" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6245] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,connection.autoconnect-priority,connection.timestamp,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6260] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6261] audit: op="connection-add" uuid="03959f08-5460-4c73-b877-6a2cabc749b7" name="br-ex-if" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6313] audit: op="connection-update" uuid="988752d1-f6dd-5210-8cc7-badebbd9e56f" name="ci-private-network" args="ovs-external-ids.data,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routes,connection.port-type,connection.controller,connection.master,connection.slave-type,connection.timestamp,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.routes,ipv4.addresses,ipv4.never-default,ovs-interface.type" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6329] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6330] audit: op="connection-add" uuid="d8b84da3-ff7e-47e0-a1c0-7310176fc0c9" name="vlan20-if" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6344] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6346] audit: op="connection-add" uuid="1c60bfeb-cdd3-4cde-ae22-be683943aa5d" name="vlan21-if" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6362] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6364] audit: op="connection-add" uuid="729620bb-6d5e-4ddb-9af1-044bea997230" name="vlan22-if" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6376] audit: op="connection-delete" uuid="b7642b6c-6201-3f64-81e9-061049f4b21f" name="Wired connection 1" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6388] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6399] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6402] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (2a4d4766-286c-4da7-9e7e-4556a65bb05d)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6402] audit: op="connection-activate" uuid="2a4d4766-286c-4da7-9e7e-4556a65bb05d" name="br-ex-br" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6403] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6407] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6410] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8b2d714f-ef32-4e45-91a9-5219224e8bff)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6411] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6415] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6418] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (15694283-e80b-41d3-8d52-c7fa888e4f7e)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6420] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6424] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6427] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (bcafe861-a516-4b7a-8e7c-5028d7f0ee4d)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6428] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6433] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6437] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (64867eb3-88de-4532-955c-703aad1f4ad0)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6439] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6444] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6448] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (355a2d07-3f41-488b-a174-27c20a2dae6f)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6449] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6451] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6453] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6457] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6461] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6463] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (03959f08-5460-4c73-b877-6a2cabc749b7)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6464] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6466] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6468] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6469] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6469] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6478] device (eth1): disconnecting for new activation request.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6478] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6480] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6481] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6482] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6484] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6487] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6490] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (d8b84da3-ff7e-47e0-a1c0-7310176fc0c9)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6490] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6492] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6493] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6494] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6495] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6499] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6501] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (1c60bfeb-cdd3-4cde-ae22-be683943aa5d)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6502] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6504] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6505] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6506] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6508] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6511] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6514] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (729620bb-6d5e-4ddb-9af1-044bea997230)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6515] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6517] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6518] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6519] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6520] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6529] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-client-id,ipv4.dhcp-timeout" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6530] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6532] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6533] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6538] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6540] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6543] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6545] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6546] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 kernel: ovs-system: entered promiscuous mode
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6550] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6552] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6554] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6556] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6559] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6562] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6564] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6566] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 systemd-udevd[54473]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:09:30 np0005470441 kernel: Timeout policy base is empty
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6570] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6573] dhcp4 (eth0): canceled DHCP transaction
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6573] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6573] dhcp4 (eth0): state changed no lease
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6574] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6587] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6590] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54468 uid=0 result="fail" reason="Device is not activated"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6615] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6625] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6633] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6639] dhcp4 (eth0): state changed new lease, address=38.102.83.144
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6672] device (eth1): disconnecting for new activation request.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6673] audit: op="connection-activate" uuid="988752d1-f6dd-5210-8cc7-badebbd9e56f" name="ci-private-network" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6717] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54468 uid=0 result="success"
Oct  4 01:09:30 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6730] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6820] device (eth1): Activation: starting connection 'ci-private-network' (988752d1-f6dd-5210-8cc7-badebbd9e56f)
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6827] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6829] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6835] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6836] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6837] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6838] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6839] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6840] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6848] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6852] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6855] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6858] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6860] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6862] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6864] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6867] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6870] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6873] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6878] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6881] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6885] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6890] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  4 01:09:30 np0005470441 kernel: br-ex: entered promiscuous mode
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6895] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  4 01:09:30 np0005470441 kernel: vlan22: entered promiscuous mode
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6932] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6933] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.6937] device (eth1): Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7003] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  4 01:09:30 np0005470441 systemd-udevd[54472]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7013] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7035] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7036] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7042] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 kernel: vlan20: entered promiscuous mode
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7104] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7115] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 kernel: vlan21: entered promiscuous mode
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7133] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7134] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7140] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7212] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7224] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7241] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7243] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7248] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7649] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7661] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7676] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7677] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  4 01:09:30 np0005470441 NetworkManager[51690]: <info>  [1759554570.7683] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  4 01:09:31 np0005470441 NetworkManager[51690]: <info>  [1759554571.8261] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54468 uid=0 result="success"
Oct  4 01:09:31 np0005470441 NetworkManager[51690]: <info>  [1759554571.9589] checkpoint[0x55ab8208d950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  4 01:09:31 np0005470441 NetworkManager[51690]: <info>  [1759554571.9591] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.2127] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.2146] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.4062] audit: op="networking-control" arg="global-dns-configuration" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.4085] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.4105] audit: op="networking-control" arg="global-dns-configuration" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.4121] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 python3.9[54802]: ansible-ansible.legacy.async_status Invoked with jid=j196204716394.54462 mode=status _async_dir=/root/.ansible_async
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.5605] checkpoint[0x55ab8208da20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  4 01:09:32 np0005470441 NetworkManager[51690]: <info>  [1759554572.5618] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54468 uid=0 result="success"
Oct  4 01:09:32 np0005470441 ansible-async_wrapper.py[54466]: Module complete (54466)
Oct  4 01:09:33 np0005470441 ansible-async_wrapper.py[54465]: Done in kid B.
Oct  4 01:09:35 np0005470441 python3.9[54906]: ansible-ansible.legacy.async_status Invoked with jid=j196204716394.54462 mode=status _async_dir=/root/.ansible_async
Oct  4 01:09:36 np0005470441 python3.9[55006]: ansible-ansible.legacy.async_status Invoked with jid=j196204716394.54462 mode=cleanup _async_dir=/root/.ansible_async
Oct  4 01:09:37 np0005470441 python3.9[55158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:09:37 np0005470441 python3.9[55281]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554576.8052785-927-219028915610231/.source.returncode _original_basename=.mbb8dkuq follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:38 np0005470441 python3.9[55433]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:09:39 np0005470441 python3.9[55556]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554578.05885-975-149855313401018/.source.cfg _original_basename=.6qqdmzhk follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:09:39 np0005470441 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  4 01:09:39 np0005470441 python3.9[55712]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:09:39 np0005470441 systemd[1]: Reloading Network Manager...
Oct  4 01:09:39 np0005470441 NetworkManager[51690]: <info>  [1759554579.9447] audit: op="reload" arg="0" pid=55716 uid=0 result="success"
Oct  4 01:09:39 np0005470441 NetworkManager[51690]: <info>  [1759554579.9453] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  4 01:09:39 np0005470441 systemd[1]: Reloaded Network Manager.
Oct  4 01:09:40 np0005470441 systemd[1]: session-13.scope: Deactivated successfully.
Oct  4 01:09:40 np0005470441 systemd[1]: session-13.scope: Consumed 50.122s CPU time.
Oct  4 01:09:40 np0005470441 systemd-logind[796]: Session 13 logged out. Waiting for processes to exit.
Oct  4 01:09:40 np0005470441 systemd-logind[796]: Removed session 13.
Oct  4 01:09:45 np0005470441 systemd-logind[796]: New session 14 of user zuul.
Oct  4 01:09:45 np0005470441 systemd[1]: Started Session 14 of User zuul.
Oct  4 01:09:46 np0005470441 python3.9[55900]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:09:47 np0005470441 python3.9[56054]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:09:49 np0005470441 python3.9[56244]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:09:49 np0005470441 systemd[1]: session-14.scope: Deactivated successfully.
Oct  4 01:09:49 np0005470441 systemd[1]: session-14.scope: Consumed 2.256s CPU time.
Oct  4 01:09:49 np0005470441 systemd-logind[796]: Session 14 logged out. Waiting for processes to exit.
Oct  4 01:09:49 np0005470441 systemd-logind[796]: Removed session 14.
Oct  4 01:09:49 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 01:09:54 np0005470441 systemd-logind[796]: New session 15 of user zuul.
Oct  4 01:09:54 np0005470441 systemd[1]: Started Session 15 of User zuul.
Oct  4 01:09:55 np0005470441 python3.9[56426]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:09:56 np0005470441 python3.9[56580]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:09:57 np0005470441 python3.9[56736]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:09:58 np0005470441 python3.9[56821]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:10:00 np0005470441 python3.9[56974]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:10:02 np0005470441 python3.9[57166]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:03 np0005470441 python3.9[57318]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:10:03 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:10:04 np0005470441 python3.9[57481]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:04 np0005470441 python3.9[57559]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:05 np0005470441 python3.9[57711]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:05 np0005470441 python3.9[57789]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:06 np0005470441 python3.9[57941]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:07 np0005470441 python3.9[58093]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:08 np0005470441 python3.9[58245]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:08 np0005470441 python3.9[58397]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:09 np0005470441 python3.9[58549]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:10:11 np0005470441 python3.9[58702]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:10:12 np0005470441 python3.9[58856]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:10:13 np0005470441 python3.9[59008]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:10:14 np0005470441 python3.9[59160]: ansible-service_facts Invoked
Oct  4 01:10:14 np0005470441 network[59177]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:10:14 np0005470441 network[59178]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:10:14 np0005470441 network[59179]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:10:20 np0005470441 python3.9[59633]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:10:23 np0005470441 python3.9[59786]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  4 01:10:24 np0005470441 python3.9[59938]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:25 np0005470441 python3.9[60063]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554624.1214793-621-224755495298006/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:26 np0005470441 python3.9[60217]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:26 np0005470441 python3.9[60342]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554625.621917-667-208250725414461/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:28 np0005470441 python3.9[60496]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:30 np0005470441 python3.9[60650]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:10:31 np0005470441 python3.9[60734]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:10:32 np0005470441 python3.9[60888]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:10:33 np0005470441 python3.9[60972]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:10:33 np0005470441 chronyd[798]: chronyd exiting
Oct  4 01:10:33 np0005470441 systemd[1]: Stopping NTP client/server...
Oct  4 01:10:33 np0005470441 systemd[1]: chronyd.service: Deactivated successfully.
Oct  4 01:10:33 np0005470441 systemd[1]: Stopped NTP client/server.
Oct  4 01:10:33 np0005470441 systemd[1]: Starting NTP client/server...
Oct  4 01:10:33 np0005470441 chronyd[60981]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  4 01:10:33 np0005470441 chronyd[60981]: Frequency -28.460 +/- 0.096 ppm read from /var/lib/chrony/drift
Oct  4 01:10:33 np0005470441 chronyd[60981]: Loaded seccomp filter (level 2)
Oct  4 01:10:33 np0005470441 systemd[1]: Started NTP client/server.
Oct  4 01:10:34 np0005470441 systemd[1]: session-15.scope: Deactivated successfully.
Oct  4 01:10:34 np0005470441 systemd[1]: session-15.scope: Consumed 24.648s CPU time.
Oct  4 01:10:34 np0005470441 systemd-logind[796]: Session 15 logged out. Waiting for processes to exit.
Oct  4 01:10:34 np0005470441 systemd-logind[796]: Removed session 15.
Oct  4 01:10:41 np0005470441 systemd-logind[796]: New session 16 of user zuul.
Oct  4 01:10:41 np0005470441 systemd[1]: Started Session 16 of User zuul.
Oct  4 01:10:42 np0005470441 python3.9[61160]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:10:43 np0005470441 python3.9[61316]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:44 np0005470441 python3.9[61491]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:44 np0005470441 python3.9[61569]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.p6yoc0g3 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:46 np0005470441 python3.9[61721]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:46 np0005470441 python3.9[61844]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554645.545113-144-150587450528431/.source _original_basename=.92ifrxna follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:47 np0005470441 python3.9[61996]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:49 np0005470441 python3.9[62148]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:49 np0005470441 python3.9[62271]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554648.721845-216-106936835608205/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:50 np0005470441 python3.9[62423]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:50 np0005470441 python3.9[62546]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554649.9219277-216-203179618372480/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:10:52 np0005470441 python3.9[62698]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:53 np0005470441 python3.9[62850]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:53 np0005470441 python3.9[62973]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554652.6264968-327-107728320248118/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:54 np0005470441 python3.9[63125]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:55 np0005470441 python3.9[63248]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554654.1128943-372-197900242970479/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:56 np0005470441 python3.9[63400]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:10:56 np0005470441 systemd[1]: Reloading.
Oct  4 01:10:56 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:10:56 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:10:56 np0005470441 systemd[1]: Reloading.
Oct  4 01:10:56 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:10:56 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:10:56 np0005470441 systemd[1]: Starting EDPM Container Shutdown...
Oct  4 01:10:56 np0005470441 systemd[1]: Finished EDPM Container Shutdown.
Oct  4 01:10:57 np0005470441 python3.9[63626]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:58 np0005470441 python3.9[63749]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554657.18995-441-77806151664406/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:10:59 np0005470441 python3.9[63901]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:10:59 np0005470441 python3.9[64024]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554658.562384-487-48515010525002/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:00 np0005470441 python3.9[64176]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:11:00 np0005470441 systemd[1]: Reloading.
Oct  4 01:11:00 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:11:00 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:11:00 np0005470441 systemd[1]: Reloading.
Oct  4 01:11:00 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:11:00 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:11:01 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:11:01 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:11:01 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:11:01 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:11:01 np0005470441 python3.9[64402]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:11:01 np0005470441 network[64419]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:11:01 np0005470441 network[64420]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:11:01 np0005470441 network[64421]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:11:05 np0005470441 python3.9[64685]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:11:05 np0005470441 systemd[1]: Reloading.
Oct  4 01:11:05 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:11:05 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:11:05 np0005470441 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  4 01:11:05 np0005470441 iptables.init[64725]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  4 01:11:06 np0005470441 iptables.init[64725]: iptables: Flushing firewall rules: [  OK  ]
Oct  4 01:11:06 np0005470441 systemd[1]: iptables.service: Deactivated successfully.
Oct  4 01:11:06 np0005470441 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  4 01:11:06 np0005470441 python3.9[64921]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:11:07 np0005470441 python3.9[65075]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:11:07 np0005470441 systemd[1]: Reloading.
Oct  4 01:11:08 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:11:08 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:11:08 np0005470441 systemd[1]: Starting Netfilter Tables...
Oct  4 01:11:08 np0005470441 systemd[1]: Finished Netfilter Tables.
Oct  4 01:11:09 np0005470441 python3.9[65267]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:11:10 np0005470441 python3.9[65420]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:10 np0005470441 python3.9[65545]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554669.715671-693-222777581419340/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:11 np0005470441 python3.9[65696]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:11:37 np0005470441 systemd[1]: session-16.scope: Deactivated successfully.
Oct  4 01:11:37 np0005470441 systemd[1]: session-16.scope: Consumed 18.934s CPU time.
Oct  4 01:11:37 np0005470441 systemd-logind[796]: Session 16 logged out. Waiting for processes to exit.
Oct  4 01:11:37 np0005470441 systemd-logind[796]: Removed session 16.
Oct  4 01:11:50 np0005470441 systemd-logind[796]: New session 17 of user zuul.
Oct  4 01:11:50 np0005470441 systemd[1]: Started Session 17 of User zuul.
Oct  4 01:11:51 np0005470441 python3.9[65889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:11:52 np0005470441 python3.9[66045]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:53 np0005470441 python3.9[66220]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:53 np0005470441 python3.9[66298]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.6drg5xeq recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:54 np0005470441 python3.9[66450]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:55 np0005470441 python3.9[66528]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.0ke5ejwn recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:55 np0005470441 python3.9[66680]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:11:56 np0005470441 python3.9[66832]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:56 np0005470441 python3.9[66910]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:11:57 np0005470441 python3.9[67062]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:57 np0005470441 python3.9[67140]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:11:58 np0005470441 python3.9[67292]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:11:59 np0005470441 python3.9[67444]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:11:59 np0005470441 python3.9[67522]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:00 np0005470441 python3.9[67674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:01 np0005470441 python3.9[67752]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:02 np0005470441 python3.9[67904]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:12:02 np0005470441 systemd[1]: Reloading.
Oct  4 01:12:02 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:12:02 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:12:03 np0005470441 python3.9[68092]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:03 np0005470441 python3.9[68170]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:04 np0005470441 python3.9[68322]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:04 np0005470441 python3.9[68400]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:05 np0005470441 python3.9[68552]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:12:05 np0005470441 systemd[1]: Reloading.
Oct  4 01:12:05 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:12:05 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:12:05 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:12:05 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:12:05 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:12:05 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:12:06 np0005470441 python3.9[68743]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:12:06 np0005470441 network[68760]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:12:06 np0005470441 network[68761]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:12:06 np0005470441 network[68762]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:12:10 np0005470441 python3.9[69025]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:11 np0005470441 python3.9[69103]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:12 np0005470441 python3.9[69255]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:12 np0005470441 python3.9[69407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:13 np0005470441 python3.9[69530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554732.300327-609-234186537294594/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:14 np0005470441 python3.9[69682]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  4 01:12:14 np0005470441 systemd[1]: Starting Time & Date Service...
Oct  4 01:12:14 np0005470441 systemd[1]: Started Time & Date Service.
Oct  4 01:12:15 np0005470441 python3.9[69838]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:16 np0005470441 python3.9[69990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:16 np0005470441 python3.9[70113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554735.8339455-714-151576295237028/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:17 np0005470441 python3.9[70265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:18 np0005470441 python3.9[70388]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554737.0863984-759-21192933670856/.source.yaml _original_basename=.68epullj follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:19 np0005470441 python3.9[70540]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:19 np0005470441 python3.9[70663]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554738.5377657-804-242548169234416/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:20 np0005470441 python3.9[70815]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:21 np0005470441 python3.9[70968]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:21 np0005470441 python3[71121]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  4 01:12:22 np0005470441 python3.9[71273]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:23 np0005470441 python3.9[71396]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554742.2202966-921-232152149185201/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:23 np0005470441 python3.9[71548]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:24 np0005470441 python3.9[71671]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554743.4926748-966-171805123320193/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:25 np0005470441 python3.9[71823]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:25 np0005470441 python3.9[71946]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554744.805464-1011-210057820044628/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:26 np0005470441 python3.9[72098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:27 np0005470441 python3.9[72221]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554746.0443285-1056-194474403647381/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:27 np0005470441 python3.9[72373]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:12:28 np0005470441 python3.9[72496]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554747.2834241-1101-276199253538098/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:29 np0005470441 python3.9[72648]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:29 np0005470441 python3.9[72800]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:30 np0005470441 python3.9[72959]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:31 np0005470441 python3.9[73112]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:32 np0005470441 python3.9[73264]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:33 np0005470441 python3.9[73416]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  4 01:12:33 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:12:33 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:12:33 np0005470441 python3.9[73570]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  4 01:12:34 np0005470441 systemd[1]: session-17.scope: Deactivated successfully.
Oct  4 01:12:34 np0005470441 systemd[1]: session-17.scope: Consumed 30.789s CPU time.
Oct  4 01:12:34 np0005470441 systemd-logind[796]: Session 17 logged out. Waiting for processes to exit.
Oct  4 01:12:34 np0005470441 systemd-logind[796]: Removed session 17.
Oct  4 01:12:40 np0005470441 systemd-logind[796]: New session 18 of user zuul.
Oct  4 01:12:40 np0005470441 systemd[1]: Started Session 18 of User zuul.
Oct  4 01:12:41 np0005470441 python3.9[73751]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  4 01:12:42 np0005470441 python3.9[73903]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:12:43 np0005470441 chronyd[60981]: Selected source 96.53.17.246 (pool.ntp.org)
Oct  4 01:12:43 np0005470441 python3.9[74055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:12:44 np0005470441 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  4 01:12:44 np0005470441 python3.9[74207]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2jNVrVuTFv3jZbMkTFAzN1Oee7Y+4cde0umzAn6mwXFLOhraDYyfHywMgEHgLSNpEYUxYzhzBDrvgCLa9+DHa2xBZLdjKy0lZwBlNSksCdTWeFkiW43E+o2yen1Io6cRO5A98jm7/x+TP7BQfOPOHBU6njxP52xmVtN7uuG6eKa2FpL/97so5Ik2Ic1ZMq8UT6ignVtLnSACzW/qkmgohdVgEZt8QLT4Dq9cdOe52PQIvvEH8N0mQTWptlo7553rmNeAXQkm9YAZeMwa4fqTPdYolv/a7EG7ltET7RJwMqqSdXzNP/pO+r6gTEbynKeBjMWfZbS3QmKUcNSxTSO+WfXoeUYkNLxH0DgWR6KCk+/JSl4oMJKUkr4p0hZ3yWz4XpEPfC6LgzXExPFvCEG8ADvcDR4KMGvmpXoJ6sqBNYb+MP8LhXaSO0c5qGryu8jyGl4yOaX88KSFhzQjNClzwx00x7sxjAzzCoGeCFB9xekoSsSjX+xlT32DwhCZ8IgM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE2weXbs4VXEGhm/ZfX/uZ2S5exbg31fyKV0//nJo5aq#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNxz7K9Zs6JdfNju2EzHUDsDDKXrXaHzRQlsq17agxI/aS9bQPJltpWP8RBWsbpgNzO58hM0Ux/Nuxsq0yADjUQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChkBkXUgVX+c7qwP0oFzpKgOCGqsY90HCtqAg+E6TRMDz6Xr3/mR0XNyXARLcKyWFbox4L3sqTPr3fm+0oQv/Z9DNg8F5rYkPdXc+UdarF1kkOKL8DmJ6Na5OSvqTlJkyVqW5Kn5sQBa9d+n2iX2uk46GtbkyLooAX8jCWAB8Rpx3CQUOtgsbSMsH3CrEskRWMT7gQ5D8yhH/JOvzsmgJ5BgThzyhIvBS/93yn+WQPum6oTl50CVkgyW2XN5X4UD2Ug/X0rMJ4wNSUuhSXj+SQyncC2WBd8kvVy3UK5K0CRrE0WyuWnSWffKDRL4xZaWg7+5ydxOW7ryB2ASVsgzwSxEFAn2y9Ce8re6kH4J8DjlXYTToBWfzIMcNKcyC3bpHUU2XqNunWa80m3aCs9CKHdNmYC1xZ9XVlrFKroStnLzCJyEPfWTroJMwEyCj8uwUJBKAEtSuPZbvracGO3ZgHkt53ljUxtqMM8WNs6KlDIvVMYXjAvQeeR4g/jvNSnAE=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICmsgooe9B2583G4cx6bKWWFYb7ZF4f/A2seof1wweft#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEiTOIq8rSUMRdl4a7ohnDYwaD6Q30j97bKSRsKecyFRuQfyTllFjE8QMH0k5o6cXiONqugMPX59t+n/JdFDqBA=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDbCUM2Uq3ZmzIsJUNDNhcFz5Zh3NNRbFButdS0Q3TzbH0sxm2zUxuKCXjr7wlYGIdi7OeK/fML4iPYiGsftTAPIMbgdDSCdSU/EZ9qub5OCx1hlkpWmQZVOHjV02DYsske3Ei4OHk7grRddabuo2k5Rjm9FyUmrF9HjrODlz+UpGxt28pGJ+tdzTgsYryaRc4tT3C8QKjplRoaeUBvE25ci2eE7ZVA69hC4RhBgfw89CvytEGVyFVp8/qDAUu528QcUAtjRJ0YIGm07QKN3hR+m0o9WAkFGbo5A347QC5zw9HXy5XKIasdvrOvrYw/+dVQVAcBLbWWGpBWdltT5mfHzXWNLEBn37stkcpk85Fps+OvaL5HhJqotBOG1lG8CwUvlvg6D7JCO1rQ96PvW8LsCWUYjHm2BgiSUOL1ZROx/O5gAfFVGzXY9F26NmSKZemd+uBqkYYH+Cwtbz4GQBtY/FB3sXKw+xxRW3rJY7Lin9Hbn6TNPXfwmeZcNcop93M=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINSJ+ngFRJ0O7IlNCiSp2YIAHGMuYO7Q1tbgt5RXBfhV#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNCUF/78U//PlSkonChPeQx92cpy/xnyjibVSCPJIN3M4MFMLssKUwznIk6awwuIZRWdCY30Jp1NwX8YC2zntAs=#012 create=True mode=0644 path=/tmp/ansible.4lkfp8w9 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:45 np0005470441 python3.9[74361]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.4lkfp8w9' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:46 np0005470441 python3.9[74515]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.4lkfp8w9 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:47 np0005470441 systemd[1]: session-18.scope: Deactivated successfully.
Oct  4 01:12:47 np0005470441 systemd[1]: session-18.scope: Consumed 3.394s CPU time.
Oct  4 01:12:47 np0005470441 systemd-logind[796]: Session 18 logged out. Waiting for processes to exit.
Oct  4 01:12:47 np0005470441 systemd-logind[796]: Removed session 18.
Oct  4 01:12:52 np0005470441 systemd-logind[796]: New session 19 of user zuul.
Oct  4 01:12:52 np0005470441 systemd[1]: Started Session 19 of User zuul.
Oct  4 01:12:53 np0005470441 python3.9[74693]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:12:54 np0005470441 python3.9[74849]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  4 01:12:55 np0005470441 python3.9[75003]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:12:56 np0005470441 python3.9[75156]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:57 np0005470441 python3.9[75309]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:12:58 np0005470441 python3.9[75463]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:12:59 np0005470441 python3.9[75618]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:12:59 np0005470441 systemd[1]: session-19.scope: Deactivated successfully.
Oct  4 01:12:59 np0005470441 systemd[1]: session-19.scope: Consumed 4.314s CPU time.
Oct  4 01:12:59 np0005470441 systemd-logind[796]: Session 19 logged out. Waiting for processes to exit.
Oct  4 01:12:59 np0005470441 systemd-logind[796]: Removed session 19.
Oct  4 01:13:05 np0005470441 systemd-logind[796]: New session 20 of user zuul.
Oct  4 01:13:05 np0005470441 systemd[1]: Started Session 20 of User zuul.
Oct  4 01:13:06 np0005470441 python3.9[75796]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:13:07 np0005470441 python3.9[75952]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:13:08 np0005470441 python3.9[76036]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  4 01:13:10 np0005470441 python3.9[76187]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:13:12 np0005470441 python3.9[76338]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:13:12 np0005470441 python3.9[76488]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:13:13 np0005470441 python3.9[76638]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:13:14 np0005470441 systemd[1]: session-20.scope: Deactivated successfully.
Oct  4 01:13:14 np0005470441 systemd[1]: session-20.scope: Consumed 5.824s CPU time.
Oct  4 01:13:14 np0005470441 systemd-logind[796]: Session 20 logged out. Waiting for processes to exit.
Oct  4 01:13:14 np0005470441 systemd-logind[796]: Removed session 20.
Oct  4 01:13:19 np0005470441 systemd-logind[796]: New session 21 of user zuul.
Oct  4 01:13:19 np0005470441 systemd[1]: Started Session 21 of User zuul.
Oct  4 01:13:20 np0005470441 python3.9[76816]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:13:22 np0005470441 python3.9[76972]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:23 np0005470441 python3.9[77124]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:24 np0005470441 python3.9[77276]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:24 np0005470441 python3.9[77399]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554803.395544-155-111940621257026/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=369fc8a7837e8d080d109a8376f9b5076e070963 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:25 np0005470441 python3.9[77551]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:25 np0005470441 python3.9[77674]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554804.9414887-155-225972540043364/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=38d557f61317e57c9c3f96ead5b1e1a16744a07d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:26 np0005470441 python3.9[77826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:27 np0005470441 python3.9[77949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554806.1121998-155-279162487721402/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=033d11215192339ffaeffab4929220a1d70f5715 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:27 np0005470441 python3.9[78101]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:28 np0005470441 python3.9[78253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:29 np0005470441 python3.9[78405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:29 np0005470441 python3.9[78528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554808.773543-330-94374282130983/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a926c599d72465f92a55624aac51df96b2dcf6f3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:30 np0005470441 python3.9[78680]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:30 np0005470441 python3.9[78803]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554809.9133916-330-27910131308451/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=92a1cabe49614150675fad32c0b8f4f7f1ef4f0f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:31 np0005470441 python3.9[78955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:32 np0005470441 python3.9[79078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554811.0482302-330-92302381447237/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=8535fa5899a1af76075dae125ade8bd3176e6f6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:32 np0005470441 python3.9[79230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:33 np0005470441 python3.9[79382]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:34 np0005470441 python3.9[79534]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:34 np0005470441 python3.9[79657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554813.6518912-507-89440050749302/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=1989c37b79aed39edd5f660c51c4564e2a32289c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:35 np0005470441 python3.9[79809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:35 np0005470441 python3.9[79932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554814.752374-507-146901156952145/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=8ba9d9f0c0409e9ccf3fe1fa5e9be6ae43408d0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:36 np0005470441 python3.9[80084]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:36 np0005470441 python3.9[80207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554815.9733186-507-107720285505473/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=abcdd8207255ef761f67493d00962dcb9b73df1c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:37 np0005470441 python3.9[80359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:38 np0005470441 python3.9[80511]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:38 np0005470441 python3.9[80663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:39 np0005470441 python3.9[80786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554818.3997233-683-258809705916271/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=3f9d11ed2b8485b7d0ad2966971baac5a1fa6931 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:40 np0005470441 python3.9[80938]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:40 np0005470441 python3.9[81061]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554819.5868006-683-78334871297181/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=8ba9d9f0c0409e9ccf3fe1fa5e9be6ae43408d0e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:41 np0005470441 python3.9[81213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:41 np0005470441 python3.9[81336]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554820.753657-683-131461967286357/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=1bbe92364b98fe2322d517c1a47ee33a3a5edf55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:43 np0005470441 python3.9[81488]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:43 np0005470441 python3.9[81640]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:44 np0005470441 python3.9[81763]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554823.236469-883-160879281758441/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:45 np0005470441 python3.9[81915]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:45 np0005470441 python3.9[82067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:46 np0005470441 python3.9[82190]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554825.1705716-956-200254599744716/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:46 np0005470441 python3.9[82342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:47 np0005470441 python3.9[82494]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:48 np0005470441 python3.9[82617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554827.0704153-1028-266246658657616/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:48 np0005470441 python3.9[82769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:49 np0005470441 python3.9[82921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:49 np0005470441 python3.9[83044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554828.934466-1094-265585262077527/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:50 np0005470441 python3.9[83196]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:51 np0005470441 python3.9[83348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:51 np0005470441 python3.9[83471]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554830.812794-1166-17035714910805/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:52 np0005470441 python3.9[83623]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:53 np0005470441 python3.9[83775]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:53 np0005470441 python3.9[83898]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554832.8084414-1239-194664107114849/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:54 np0005470441 python3.9[84050]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:13:55 np0005470441 python3.9[84202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:13:55 np0005470441 python3.9[84325]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554834.6550674-1311-188851639384164/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1cfb011044a6c6eec701c2e46c504f5b77e7b6b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:13:57 np0005470441 systemd[1]: session-21.scope: Deactivated successfully.
Oct  4 01:13:57 np0005470441 systemd[1]: session-21.scope: Consumed 28.224s CPU time.
Oct  4 01:13:57 np0005470441 systemd-logind[796]: Session 21 logged out. Waiting for processes to exit.
Oct  4 01:13:57 np0005470441 systemd-logind[796]: Removed session 21.
Oct  4 01:14:03 np0005470441 systemd-logind[796]: New session 22 of user zuul.
Oct  4 01:14:03 np0005470441 systemd[1]: Started Session 22 of User zuul.
Oct  4 01:14:04 np0005470441 python3.9[84504]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:14:05 np0005470441 python3.9[84660]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:06 np0005470441 python3.9[84812]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:07 np0005470441 python3.9[84962]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:14:07 np0005470441 python3.9[85114]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  4 01:14:11 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  4 01:14:11 np0005470441 python3.9[85270]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:14:12 np0005470441 python3.9[85354]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:14:15 np0005470441 python3.9[85507]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:14:16 np0005470441 python3[85662]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  4 01:14:16 np0005470441 python3.9[85814]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:17 np0005470441 python3.9[85966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:18 np0005470441 python3.9[86044]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:19 np0005470441 python3.9[86196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:19 np0005470441 python3.9[86274]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tbu2683i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:20 np0005470441 python3.9[86426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:20 np0005470441 python3.9[86504]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:21 np0005470441 python3.9[86656]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:22 np0005470441 python3[86809]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  4 01:14:23 np0005470441 python3.9[86961]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:23 np0005470441 systemd[1]: packagekit.service: Deactivated successfully.
Oct  4 01:14:23 np0005470441 python3.9[87086]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554862.68335-432-70663479824898/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:24 np0005470441 python3.9[87238]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:25 np0005470441 python3.9[87363]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554864.42216-477-154106963584793/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:26 np0005470441 python3.9[87515]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:26 np0005470441 python3.9[87640]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554865.734114-522-13341508567348/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:27 np0005470441 python3.9[87792]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:28 np0005470441 python3.9[87917]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554867.171934-567-154069714475057/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:29 np0005470441 python3.9[88069]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:29 np0005470441 python3.9[88194]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759554868.5920215-612-252019703630503/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:30 np0005470441 python3.9[88346]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:31 np0005470441 python3.9[88498]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:32 np0005470441 python3.9[88653]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:33 np0005470441 python3.9[88805]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:33 np0005470441 python3.9[88958]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:14:34 np0005470441 python3.9[89112]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:35 np0005470441 python3.9[89267]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:36 np0005470441 python3.9[89417]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:14:37 np0005470441 python3.9[89570]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:37 np0005470441 ovs-vsctl[89571]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  4 01:14:38 np0005470441 python3.9[89723]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:39 np0005470441 python3.9[89878]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:14:39 np0005470441 ovs-vsctl[89879]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  4 01:14:40 np0005470441 python3.9[90029]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:14:40 np0005470441 python3.9[90183]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:41 np0005470441 python3.9[90335]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:42 np0005470441 python3.9[90413]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:42 np0005470441 python3.9[90565]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:43 np0005470441 python3.9[90643]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:43 np0005470441 python3.9[90795]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:44 np0005470441 python3.9[90947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:45 np0005470441 python3.9[91025]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:45 np0005470441 python3.9[91177]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:46 np0005470441 python3.9[91255]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:47 np0005470441 python3.9[91407]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:14:47 np0005470441 systemd[1]: Reloading.
Oct  4 01:14:47 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:14:47 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:14:48 np0005470441 python3.9[91596]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:48 np0005470441 python3.9[91674]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:49 np0005470441 python3.9[91826]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:49 np0005470441 python3.9[91904]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:50 np0005470441 python3.9[92056]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:14:50 np0005470441 systemd[1]: Reloading.
Oct  4 01:14:50 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:14:50 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:14:50 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:14:50 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:14:50 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:14:50 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:14:51 np0005470441 python3.9[92250]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:52 np0005470441 python3.9[92402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:52 np0005470441 python3.9[92525]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554891.9589698-1365-77099224851282/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:53 np0005470441 python3.9[92677]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:14:54 np0005470441 python3.9[92829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:14:55 np0005470441 python3.9[92952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554894.3132489-1440-201465990314457/.source.json _original_basename=.m4cgsgzs follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:56 np0005470441 python3.9[93104]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:14:58 np0005470441 python3.9[93531]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  4 01:14:59 np0005470441 python3.9[93683]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:15:00 np0005470441 python3.9[93835]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  4 01:15:00 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:15:01 np0005470441 python3[93997]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:15:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:15:02 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:15:02 np0005470441 podman[94029]: 2025-10-04 05:15:02.099700067 +0000 UTC m=+0.042242583 container create 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:15:02 np0005470441 podman[94029]: 2025-10-04 05:15:02.076352742 +0000 UTC m=+0.018895278 image pull 3df028879be2d3446cef1cbbc8cb13789865aba0f4436e902e1d2605836cf14d quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  4 01:15:02 np0005470441 python3[93997]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  4 01:15:02 np0005470441 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  4 01:15:03 np0005470441 python3.9[94216]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:15:04 np0005470441 python3.9[94370]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:04 np0005470441 python3.9[94446]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:15:05 np0005470441 python3.9[94597]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759554904.5551345-1704-187372430154295/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:05 np0005470441 python3.9[94673]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:15:05 np0005470441 systemd[1]: Reloading.
Oct  4 01:15:05 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:15:05 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:15:06 np0005470441 python3.9[94786]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:15:06 np0005470441 systemd[1]: Reloading.
Oct  4 01:15:06 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:15:06 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:15:06 np0005470441 systemd[1]: Starting ovn_controller container...
Oct  4 01:15:06 np0005470441 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  4 01:15:06 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:15:06 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0616e8fa79a4b519702cf8ab1fd9fdb0dfcaf8d17c4cc5b6aa9ce838ef1ab31/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  4 01:15:06 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15.
Oct  4 01:15:06 np0005470441 podman[94827]: 2025-10-04 05:15:06.940769536 +0000 UTC m=+0.123939691 container init 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:15:06 np0005470441 ovn_controller[94840]: + sudo -E kolla_set_configs
Oct  4 01:15:06 np0005470441 podman[94827]: 2025-10-04 05:15:06.963715931 +0000 UTC m=+0.146886056 container start 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:15:06 np0005470441 edpm-start-podman-container[94827]: ovn_controller
Oct  4 01:15:06 np0005470441 systemd[1]: Created slice User Slice of UID 0.
Oct  4 01:15:06 np0005470441 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  4 01:15:07 np0005470441 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  4 01:15:07 np0005470441 systemd[1]: Starting User Manager for UID 0...
Oct  4 01:15:07 np0005470441 edpm-start-podman-container[94826]: Creating additional drop-in dependency for "ovn_controller" (9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15)
Oct  4 01:15:07 np0005470441 podman[94849]: 2025-10-04 05:15:07.044138436 +0000 UTC m=+0.067690271 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:15:07 np0005470441 systemd[1]: 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15-30443f2929ae50e4.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:15:07 np0005470441 systemd[1]: 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15-30443f2929ae50e4.service: Failed with result 'exit-code'.
Oct  4 01:15:07 np0005470441 systemd[1]: Reloading.
Oct  4 01:15:07 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:15:07 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:15:07 np0005470441 systemd[94882]: Queued start job for default target Main User Target.
Oct  4 01:15:07 np0005470441 systemd[94882]: Created slice User Application Slice.
Oct  4 01:15:07 np0005470441 systemd[94882]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  4 01:15:07 np0005470441 systemd[94882]: Started Daily Cleanup of User's Temporary Directories.
Oct  4 01:15:07 np0005470441 systemd[94882]: Reached target Paths.
Oct  4 01:15:07 np0005470441 systemd[94882]: Reached target Timers.
Oct  4 01:15:07 np0005470441 systemd[94882]: Starting D-Bus User Message Bus Socket...
Oct  4 01:15:07 np0005470441 systemd[94882]: Starting Create User's Volatile Files and Directories...
Oct  4 01:15:07 np0005470441 systemd[94882]: Finished Create User's Volatile Files and Directories.
Oct  4 01:15:07 np0005470441 systemd[94882]: Listening on D-Bus User Message Bus Socket.
Oct  4 01:15:07 np0005470441 systemd[94882]: Reached target Sockets.
Oct  4 01:15:07 np0005470441 systemd[94882]: Reached target Basic System.
Oct  4 01:15:07 np0005470441 systemd[94882]: Reached target Main User Target.
Oct  4 01:15:07 np0005470441 systemd[94882]: Startup finished in 125ms.
Oct  4 01:15:07 np0005470441 systemd[1]: Started User Manager for UID 0.
Oct  4 01:15:07 np0005470441 systemd[1]: Started ovn_controller container.
Oct  4 01:15:07 np0005470441 systemd[1]: Started Session c1 of User root.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: INFO:__main__:Validating config file
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: INFO:__main__:Writing out command to execute
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: ++ cat /run_command
Oct  4 01:15:07 np0005470441 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + ARGS=
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + sudo kolla_copy_cacerts
Oct  4 01:15:07 np0005470441 systemd[1]: Started Session c2 of User root.
Oct  4 01:15:07 np0005470441 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + [[ ! -n '' ]]
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + . kolla_extend_start
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + umask 0022
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4176] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4181] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4190] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4195] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4197] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  4 01:15:07 np0005470441 kernel: br-int: entered promiscuous mode
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  4 01:15:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:07Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4423] manager: (ovn-90b35d-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  4 01:15:07 np0005470441 systemd-udevd[94977]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:15:07 np0005470441 kernel: genev_sys_6081: entered promiscuous mode
Oct  4 01:15:07 np0005470441 systemd-udevd[94978]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4603] device (genev_sys_6081): carrier: link connected
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.4606] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.5271] manager: (ovn-f13c8e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct  4 01:15:07 np0005470441 NetworkManager[51690]: <info>  [1759554907.8722] manager: (ovn-a27e69-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Oct  4 01:15:08 np0005470441 python3.9[95110]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:15:08 np0005470441 ovs-vsctl[95111]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  4 01:15:09 np0005470441 python3.9[95263]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:15:09 np0005470441 ovs-vsctl[95265]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  4 01:15:10 np0005470441 python3.9[95418]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:15:10 np0005470441 ovs-vsctl[95419]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  4 01:15:10 np0005470441 systemd[1]: session-22.scope: Deactivated successfully.
Oct  4 01:15:10 np0005470441 systemd[1]: session-22.scope: Consumed 44.789s CPU time.
Oct  4 01:15:10 np0005470441 systemd-logind[796]: Session 22 logged out. Waiting for processes to exit.
Oct  4 01:15:10 np0005470441 systemd-logind[796]: Removed session 22.
Oct  4 01:15:16 np0005470441 systemd-logind[796]: New session 24 of user zuul.
Oct  4 01:15:16 np0005470441 systemd[1]: Started Session 24 of User zuul.
Oct  4 01:15:17 np0005470441 python3.9[95597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:15:17 np0005470441 systemd[1]: Stopping User Manager for UID 0...
Oct  4 01:15:17 np0005470441 systemd[94882]: Activating special unit Exit the Session...
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped target Main User Target.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped target Basic System.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped target Paths.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped target Sockets.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped target Timers.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  4 01:15:17 np0005470441 systemd[94882]: Closed D-Bus User Message Bus Socket.
Oct  4 01:15:17 np0005470441 systemd[94882]: Stopped Create User's Volatile Files and Directories.
Oct  4 01:15:17 np0005470441 systemd[94882]: Removed slice User Application Slice.
Oct  4 01:15:17 np0005470441 systemd[94882]: Reached target Shutdown.
Oct  4 01:15:17 np0005470441 systemd[94882]: Finished Exit the Session.
Oct  4 01:15:17 np0005470441 systemd[94882]: Reached target Exit the Session.
Oct  4 01:15:17 np0005470441 systemd[1]: user@0.service: Deactivated successfully.
Oct  4 01:15:17 np0005470441 systemd[1]: Stopped User Manager for UID 0.
Oct  4 01:15:17 np0005470441 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  4 01:15:17 np0005470441 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  4 01:15:17 np0005470441 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  4 01:15:17 np0005470441 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  4 01:15:17 np0005470441 systemd[1]: Removed slice User Slice of UID 0.
Oct  4 01:15:18 np0005470441 python3.9[95754]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:19 np0005470441 python3.9[95906]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:19 np0005470441 python3.9[96058]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:20 np0005470441 python3.9[96210]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:21 np0005470441 python3.9[96362]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:21 np0005470441 python3.9[96512]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:15:22 np0005470441 python3.9[96664]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  4 01:15:24 np0005470441 python3.9[96814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:25 np0005470441 python3.9[96935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554923.7554767-219-254359607016794/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:25 np0005470441 python3.9[97086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:26 np0005470441 python3.9[97207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554925.5014274-264-211356066766585/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:27 np0005470441 python3.9[97359]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:15:28 np0005470441 python3.9[97443]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:15:31 np0005470441 python3.9[97596]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:15:31 np0005470441 python3.9[97749]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:32 np0005470441 python3.9[97870]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554931.4390275-375-2561020685955/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:33 np0005470441 python3.9[98020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:33 np0005470441 python3.9[98141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554932.5604868-375-89330873603794/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:35 np0005470441 python3.9[98291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:35 np0005470441 python3.9[98412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554934.6773667-507-83206759029377/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:36 np0005470441 python3.9[98562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:36 np0005470441 python3.9[98683]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554935.7416773-507-55075330672747/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:37Z|00025|memory|INFO|16128 kB peak resident set size after 29.8 seconds
Oct  4 01:15:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:15:37Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct  4 01:15:37 np0005470441 podman[98807]: 2025-10-04 05:15:37.245243953 +0000 UTC m=+0.084329617 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  4 01:15:37 np0005470441 python3.9[98843]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:15:38 np0005470441 python3.9[99010]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:38 np0005470441 python3.9[99162]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:39 np0005470441 python3.9[99240]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:39 np0005470441 python3.9[99392]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:40 np0005470441 python3.9[99470]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:41 np0005470441 python3.9[99622]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:41 np0005470441 python3.9[99774]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:42 np0005470441 python3.9[99852]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:43 np0005470441 python3.9[100004]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:43 np0005470441 python3.9[100082]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:44 np0005470441 python3.9[100234]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:15:44 np0005470441 systemd[1]: Reloading.
Oct  4 01:15:44 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:15:44 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:15:45 np0005470441 python3.9[100422]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:46 np0005470441 python3.9[100500]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:47 np0005470441 python3.9[100652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:47 np0005470441 python3.9[100730]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:48 np0005470441 python3.9[100882]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:15:48 np0005470441 systemd[1]: Reloading.
Oct  4 01:15:48 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:15:48 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:15:48 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:15:48 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:15:48 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:15:48 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:15:49 np0005470441 python3.9[101075]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:50 np0005470441 python3.9[101227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:51 np0005470441 python3.9[101350]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759554949.9266121-960-53000839006511/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:51 np0005470441 python3.9[101502]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:15:52 np0005470441 python3.9[101654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:15:53 np0005470441 python3.9[101777]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759554952.2400777-1035-5237224667736/.source.json _original_basename=.n4prqup0 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:53 np0005470441 python3.9[101929]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:15:56 np0005470441 python3.9[102356]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  4 01:15:57 np0005470441 python3.9[102508]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:15:58 np0005470441 python3.9[102660]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  4 01:15:59 np0005470441 python3[102838]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:16:00 np0005470441 podman[102873]: 2025-10-04 05:16:00.179897446 +0000 UTC m=+0.048774444 container create 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:16:00 np0005470441 podman[102873]: 2025-10-04 05:16:00.154123512 +0000 UTC m=+0.023000530 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:16:00 np0005470441 python3[102838]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:16:01 np0005470441 python3.9[103060]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:16:01 np0005470441 python3.9[103214]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:02 np0005470441 python3.9[103290]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:16:02 np0005470441 python3.9[103441]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759554962.3217037-1299-36324646912159/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:03 np0005470441 python3.9[103517]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:16:03 np0005470441 systemd[1]: Reloading.
Oct  4 01:16:03 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:16:03 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:16:04 np0005470441 python3.9[103628]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:04 np0005470441 systemd[1]: Reloading.
Oct  4 01:16:04 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:16:04 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:16:04 np0005470441 systemd[1]: Starting ovn_metadata_agent container...
Oct  4 01:16:04 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:16:04 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d6dee138a439ee3ac232255ed970007dfb9e7513734c3fc83735452caba142b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  4 01:16:04 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d6dee138a439ee3ac232255ed970007dfb9e7513734c3fc83735452caba142b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:16:04 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3.
Oct  4 01:16:04 np0005470441 podman[103669]: 2025-10-04 05:16:04.747628116 +0000 UTC m=+0.209062206 container init 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + sudo -E kolla_set_configs
Oct  4 01:16:04 np0005470441 podman[103669]: 2025-10-04 05:16:04.779590424 +0000 UTC m=+0.241024494 container start 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent)
Oct  4 01:16:04 np0005470441 edpm-start-podman-container[103669]: ovn_metadata_agent
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Validating config file
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Copying service configuration files
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Writing out command to execute
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: ++ cat /run_command
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + CMD=neutron-ovn-metadata-agent
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + ARGS=
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + sudo kolla_copy_cacerts
Oct  4 01:16:04 np0005470441 edpm-start-podman-container[103668]: Creating additional drop-in dependency for "ovn_metadata_agent" (60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3)
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + [[ ! -n '' ]]
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + . kolla_extend_start
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + umask 0022
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: + exec neutron-ovn-metadata-agent
Oct  4 01:16:04 np0005470441 ovn_metadata_agent[103684]: Running command: 'neutron-ovn-metadata-agent'
Oct  4 01:16:04 np0005470441 systemd[1]: Reloading.
Oct  4 01:16:04 np0005470441 podman[103691]: 2025-10-04 05:16:04.86861222 +0000 UTC m=+0.079939872 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct  4 01:16:04 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:16:04 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:16:05 np0005470441 systemd[1]: Started ovn_metadata_agent container.
Oct  4 01:16:06 np0005470441 systemd[1]: session-24.scope: Deactivated successfully.
Oct  4 01:16:06 np0005470441 systemd[1]: session-24.scope: Consumed 33.763s CPU time.
Oct  4 01:16:06 np0005470441 systemd-logind[796]: Session 24 logged out. Waiting for processes to exit.
Oct  4 01:16:06 np0005470441 systemd-logind[796]: Removed session 24.
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.675 103689 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.676 103689 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.676 103689 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.676 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.677 103689 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.678 103689 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.679 103689 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.680 103689 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.681 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.682 103689 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.683 103689 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.684 103689 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.685 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.686 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.687 103689 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.688 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.689 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.690 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.691 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.692 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.693 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.694 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.695 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.696 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.697 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.698 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.699 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.700 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.701 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.702 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.703 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.704 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.705 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.706 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.707 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.708 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.709 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.710 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.711 103689 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.711 103689 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.720 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.720 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.720 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.721 103689 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.721 103689 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.734 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 9c4f1832-26e9-4f83-989c-c9b104eab4b1 (UUID: 9c4f1832-26e9-4f83-989c-c9b104eab4b1) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.762 103689 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.762 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.762 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.763 103689 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.769 103689 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.775 103689 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.780 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '9c4f1832-26e9-4f83-989c-c9b104eab4b1'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], external_ids={}, name=9c4f1832-26e9-4f83-989c-c9b104eab4b1, nb_cfg_timestamp=1759554915439, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.781 103689 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f5e0101dbb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.782 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.782 103689 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.783 103689 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.783 103689 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.787 103689 DEBUG oslo_service.service [-] Started child 103796 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.790 103689 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpks0f6ikz/privsep.sock']#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.791 103796 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-168365'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.815 103796 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.815 103796 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.816 103796 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.820 103796 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.827 103796 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  4 01:16:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:06.834 103796 INFO eventlet.wsgi.server [-] (103796) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  4 01:16:07 np0005470441 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.435 103689 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.436 103689 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpks0f6ikz/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.324 103801 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.328 103801 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.330 103801 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.330 103801 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103801#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.438 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[f906289e-7007-48fe-83e7-9c6bcfed9adf]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.942 103801 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.942 103801 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:16:07 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:07.942 103801 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:16:08 np0005470441 podman[103806]: 2025-10-04 05:16:08.336331565 +0000 UTC m=+0.084229353 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller)
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.506 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[aaeeb27e-ef75-4234-b6ad-429e5e6d21af]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.513 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, column=external_ids, values=({'neutron:ovn-metadata-id': '2c5fced6-7c61-5810-91f5-d8f716bc55ef'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.525 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.537 103689 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.538 103689 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.539 103689 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.540 103689 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.541 103689 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.542 103689 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.543 103689 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.544 103689 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.545 103689 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.546 103689 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.547 103689 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.548 103689 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.549 103689 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.550 103689 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.551 103689 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.552 103689 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.553 103689 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.554 103689 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.555 103689 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.556 103689 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.557 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.558 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.559 103689 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.560 103689 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.561 103689 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.562 103689 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.563 103689 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.564 103689 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.565 103689 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.566 103689 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.567 103689 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.568 103689 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.569 103689 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.570 103689 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.571 103689 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.572 103689 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.573 103689 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.574 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.575 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.576 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.577 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.578 103689 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.579 103689 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:16:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:16:08.579 103689 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  4 01:16:12 np0005470441 systemd-logind[796]: New session 25 of user zuul.
Oct  4 01:16:12 np0005470441 systemd[1]: Started Session 25 of User zuul.
Oct  4 01:16:13 np0005470441 python3.9[103986]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:16:14 np0005470441 python3.9[104142]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:16 np0005470441 python3.9[104307]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:16:16 np0005470441 systemd[1]: Reloading.
Oct  4 01:16:16 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:16:16 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:16:17 np0005470441 python3.9[104493]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:16:17 np0005470441 network[104510]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:16:17 np0005470441 network[104511]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:16:17 np0005470441 network[104512]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:16:23 np0005470441 python3.9[104776]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:23 np0005470441 python3.9[104929]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:24 np0005470441 python3.9[105082]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:25 np0005470441 python3.9[105235]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:26 np0005470441 python3.9[105388]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:27 np0005470441 python3.9[105541]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:27 np0005470441 python3.9[105694]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:16:28 np0005470441 python3.9[105847]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:29 np0005470441 python3.9[105999]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:30 np0005470441 python3.9[106151]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:30 np0005470441 python3.9[106303]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:31 np0005470441 python3.9[106455]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:32 np0005470441 python3.9[106607]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:32 np0005470441 python3.9[106759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:33 np0005470441 python3.9[106911]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:34 np0005470441 python3.9[107063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:34 np0005470441 python3.9[107215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:35 np0005470441 podman[107297]: 2025-10-04 05:16:35.306395585 +0000 UTC m=+0.059814105 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:16:35 np0005470441 python3.9[107387]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:36 np0005470441 python3.9[107539]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:36 np0005470441 python3.9[107691]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:37 np0005470441 python3.9[107843]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:16:38 np0005470441 podman[107995]: 2025-10-04 05:16:38.46119278 +0000 UTC m=+0.079901210 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:16:38 np0005470441 python3.9[107996]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:39 np0005470441 python3.9[108171]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:16:40 np0005470441 python3.9[108323]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:16:40 np0005470441 systemd[1]: Reloading.
Oct  4 01:16:40 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:16:40 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:16:41 np0005470441 python3.9[108511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:42 np0005470441 python3.9[108664]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:42 np0005470441 python3.9[108817]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:43 np0005470441 python3.9[108970]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:43 np0005470441 python3.9[109123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:44 np0005470441 python3.9[109276]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:45 np0005470441 python3.9[109429]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:16:46 np0005470441 python3.9[109582]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  4 01:16:47 np0005470441 python3.9[109735]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:16:49 np0005470441 python3.9[109893]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  4 01:16:51 np0005470441 python3.9[110053]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:16:52 np0005470441 python3.9[110137]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:17:06 np0005470441 podman[110322]: 2025-10-04 05:17:06.320034989 +0000 UTC m=+0.076533615 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:17:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:17:06.722 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:17:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:17:06.723 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:17:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:17:06.723 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:17:09 np0005470441 podman[110341]: 2025-10-04 05:17:09.386267941 +0000 UTC m=+0.132660088 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct  4 01:17:25 np0005470441 kernel: SELinux:  Converting 2752 SID table entries...
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:17:25 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:17:37 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct  4 01:17:37 np0005470441 podman[110385]: 2025-10-04 05:17:37.317102739 +0000 UTC m=+0.061395113 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  4 01:17:38 np0005470441 kernel: SELinux:  Converting 2752 SID table entries...
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:17:38 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:17:40 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  4 01:17:40 np0005470441 podman[110412]: 2025-10-04 05:17:40.359806366 +0000 UTC m=+0.098621709 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:18:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:18:06.723 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:18:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:18:06.724 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:18:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:18:06.724 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:18:08 np0005470441 podman[120263]: 2025-10-04 05:18:08.308041227 +0000 UTC m=+0.058778924 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:18:11 np0005470441 podman[121972]: 2025-10-04 05:18:11.342548552 +0000 UTC m=+0.088307223 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:18:38 np0005470441 kernel: SELinux:  Converting 2753 SID table entries...
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability network_peer_controls=1
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability open_perms=1
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability extended_socket_class=1
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability always_check_network=0
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  4 01:18:38 np0005470441 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  4 01:18:39 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  4 01:18:39 np0005470441 podman[127244]: 2025-10-04 05:18:39.328207059 +0000 UTC m=+0.069184315 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:18:40 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:18:40 np0005470441 dbus-broker-launch[759]: Noticed file-system modification, trigger reload.
Oct  4 01:18:41 np0005470441 podman[127315]: 2025-10-04 05:18:41.868606468 +0000 UTC m=+0.092011934 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Oct  4 01:18:48 np0005470441 systemd[1]: Stopping OpenSSH server daemon...
Oct  4 01:18:48 np0005470441 systemd[1]: sshd.service: Deactivated successfully.
Oct  4 01:18:48 np0005470441 systemd[1]: Stopped OpenSSH server daemon.
Oct  4 01:18:48 np0005470441 systemd[1]: sshd.service: Consumed 1.265s CPU time, read 532.0K from disk, written 0B to disk.
Oct  4 01:18:48 np0005470441 systemd[1]: Stopped target sshd-keygen.target.
Oct  4 01:18:48 np0005470441 systemd[1]: Stopping sshd-keygen.target...
Oct  4 01:18:48 np0005470441 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 01:18:48 np0005470441 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 01:18:48 np0005470441 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  4 01:18:48 np0005470441 systemd[1]: Reached target sshd-keygen.target.
Oct  4 01:18:48 np0005470441 systemd[1]: Starting OpenSSH server daemon...
Oct  4 01:18:48 np0005470441 systemd[1]: Started OpenSSH server daemon.
Oct  4 01:18:50 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:18:50 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:18:50 np0005470441 systemd[1]: Reloading.
Oct  4 01:18:50 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:18:50 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:18:50 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:18:55 np0005470441 systemd[1]: Starting PackageKit Daemon...
Oct  4 01:18:55 np0005470441 systemd[1]: Started PackageKit Daemon.
Oct  4 01:18:57 np0005470441 python3.9[134848]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:18:57 np0005470441 systemd[1]: Reloading.
Oct  4 01:18:57 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:18:57 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:18:58 np0005470441 python3.9[136137]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:18:58 np0005470441 systemd[1]: Reloading.
Oct  4 01:18:58 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:18:58 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:18:59 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:18:59 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:18:59 np0005470441 systemd[1]: man-db-cache-update.service: Consumed 10.405s CPU time.
Oct  4 01:18:59 np0005470441 systemd[1]: run-rdd3147f122c74e3dab2adc6840070ca0.service: Deactivated successfully.
Oct  4 01:18:59 np0005470441 python3.9[137076]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:18:59 np0005470441 systemd[1]: Reloading.
Oct  4 01:18:59 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:18:59 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:00 np0005470441 python3.9[137266]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:19:00 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:00 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:00 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:01 np0005470441 python3.9[137456]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:01 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:01 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:01 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:02 np0005470441 python3.9[137646]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:02 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:02 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:02 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:03 np0005470441 python3.9[137836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:04 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:04 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:04 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:05 np0005470441 python3.9[138027]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:05 np0005470441 python3.9[138182]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:05 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:05 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:05 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:19:06.724 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:19:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:19:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:19:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:19:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:19:07 np0005470441 python3.9[138372]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  4 01:19:07 np0005470441 systemd[1]: Reloading.
Oct  4 01:19:07 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:19:07 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:19:07 np0005470441 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  4 01:19:07 np0005470441 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  4 01:19:08 np0005470441 python3.9[138565]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:09 np0005470441 python3.9[138720]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:09 np0005470441 podman[138847]: 2025-10-04 05:19:09.815459672 +0000 UTC m=+0.065989858 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:19:10 np0005470441 python3.9[138894]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:11 np0005470441 python3.9[139050]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:11 np0005470441 python3.9[139205]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:12 np0005470441 podman[139207]: 2025-10-04 05:19:12.097394565 +0000 UTC m=+0.095379385 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  4 01:19:12 np0005470441 python3.9[139386]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:13 np0005470441 python3.9[139541]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:14 np0005470441 python3.9[139696]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:15 np0005470441 python3.9[139851]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:16 np0005470441 python3.9[140006]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:16 np0005470441 python3.9[140161]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:17 np0005470441 python3.9[140316]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:18 np0005470441 python3.9[140471]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:19 np0005470441 python3.9[140626]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  4 01:19:21 np0005470441 python3.9[140781]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:21 np0005470441 python3.9[140933]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:22 np0005470441 python3.9[141085]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:22 np0005470441 python3.9[141237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:23 np0005470441 python3.9[141389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:24 np0005470441 python3.9[141541]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:19:25 np0005470441 python3.9[141693]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:26 np0005470441 python3.9[141818]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555164.766141-1623-84558473659454/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:26 np0005470441 python3.9[141970]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:27 np0005470441 python3.9[142095]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555166.2820966-1623-208403276444345/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:27 np0005470441 python3.9[142247]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:28 np0005470441 python3.9[142372]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555167.435699-1623-210232287764776/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:29 np0005470441 python3.9[142524]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:29 np0005470441 python3.9[142649]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555168.5955472-1623-173581087910607/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:30 np0005470441 python3.9[142801]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:30 np0005470441 python3.9[142926]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555169.8199632-1623-171672455045779/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:31 np0005470441 python3.9[143078]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:32 np0005470441 python3.9[143203]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555171.0448344-1623-189806233486442/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:32 np0005470441 python3.9[143355]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:33 np0005470441 python3.9[143478]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555172.2819197-1623-9130879017179/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:34 np0005470441 python3.9[143630]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:34 np0005470441 python3.9[143755]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759555173.5060816-1623-36178609277726/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:35 np0005470441 python3.9[143907]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  4 01:19:36 np0005470441 python3.9[144060]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:37 np0005470441 python3.9[144212]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:37 np0005470441 python3.9[144364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:38 np0005470441 python3.9[144516]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:39 np0005470441 python3.9[144668]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:39 np0005470441 python3.9[144820]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:40 np0005470441 podman[144944]: 2025-10-04 05:19:40.09146423 +0000 UTC m=+0.054599993 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 01:19:40 np0005470441 python3.9[144991]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:40 np0005470441 python3.9[145143]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:41 np0005470441 python3.9[145295]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:42 np0005470441 python3.9[145447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:42 np0005470441 podman[145448]: 2025-10-04 05:19:42.324887681 +0000 UTC m=+0.074521807 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:19:42 np0005470441 python3.9[145623]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:43 np0005470441 python3.9[145775]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:44 np0005470441 python3.9[145927]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:44 np0005470441 python3.9[146079]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:45 np0005470441 python3.9[146231]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:46 np0005470441 python3.9[146354]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555185.2385104-2286-72081709732930/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:46 np0005470441 python3.9[146506]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:47 np0005470441 python3.9[146629]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555186.4126613-2286-33019203866232/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:48 np0005470441 python3.9[146781]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:48 np0005470441 python3.9[146904]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555187.5510457-2286-91490952422095/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:49 np0005470441 python3.9[147056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:49 np0005470441 python3.9[147179]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555188.768182-2286-127723232841610/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:50 np0005470441 python3.9[147331]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:50 np0005470441 python3.9[147454]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555189.9315743-2286-10218279085434/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:51 np0005470441 python3.9[147606]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:52 np0005470441 python3.9[147729]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555191.0455217-2286-262732756552071/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:52 np0005470441 python3.9[147881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:53 np0005470441 python3.9[148004]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555192.2234075-2286-231530168217817/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:53 np0005470441 python3.9[148156]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:54 np0005470441 python3.9[148279]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555193.3353503-2286-7897254028243/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:54 np0005470441 python3.9[148431]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:55 np0005470441 python3.9[148554]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555194.478135-2286-69547990077897/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:56 np0005470441 python3.9[148706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:56 np0005470441 python3.9[148829]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555195.60182-2286-139023146182644/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:57 np0005470441 python3.9[148981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:57 np0005470441 python3.9[149104]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555196.8011403-2286-19909863440392/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:58 np0005470441 python3.9[149256]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:19:58 np0005470441 python3.9[149379]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555197.9912841-2286-5661322246852/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:19:59 np0005470441 python3.9[149531]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:00 np0005470441 python3.9[149654]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555199.109691-2286-266588881268374/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:00 np0005470441 python3.9[149806]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:01 np0005470441 python3.9[149929]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555200.2693694-2286-163877905827517/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:05 np0005470441 python3.9[150079]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:20:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:20:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:20:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:20:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:20:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:20:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:20:06 np0005470441 python3.9[150234]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  4 01:20:10 np0005470441 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  4 01:20:10 np0005470441 podman[150263]: 2025-10-04 05:20:10.321208547 +0000 UTC m=+0.065775893 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:20:11 np0005470441 python3.9[150408]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:12 np0005470441 python3.9[150560]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:12 np0005470441 podman[150684]: 2025-10-04 05:20:12.63576699 +0000 UTC m=+0.084591836 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:20:12 np0005470441 python3.9[150734]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:13 np0005470441 python3.9[150892]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:14 np0005470441 python3.9[151044]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:15 np0005470441 python3.9[151196]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:16 np0005470441 python3.9[151348]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:16 np0005470441 python3.9[151500]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:17 np0005470441 python3.9[151652]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:18 np0005470441 python3.9[151804]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:19 np0005470441 python3.9[151956]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:20:19 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:19 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:19 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:19 np0005470441 systemd[1]: Starting libvirt logging daemon socket...
Oct  4 01:20:19 np0005470441 systemd[1]: Listening on libvirt logging daemon socket.
Oct  4 01:20:19 np0005470441 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  4 01:20:19 np0005470441 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  4 01:20:19 np0005470441 systemd[1]: Starting libvirt logging daemon...
Oct  4 01:20:19 np0005470441 systemd[1]: Started libvirt logging daemon.
Oct  4 01:20:20 np0005470441 python3.9[152149]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:20:20 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:20 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:20 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:21 np0005470441 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  4 01:20:21 np0005470441 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  4 01:20:21 np0005470441 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  4 01:20:21 np0005470441 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  4 01:20:21 np0005470441 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  4 01:20:21 np0005470441 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  4 01:20:21 np0005470441 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  4 01:20:21 np0005470441 systemd[1]: Starting libvirt nodedev daemon...
Oct  4 01:20:21 np0005470441 systemd[1]: Started libvirt nodedev daemon.
Oct  4 01:20:21 np0005470441 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  4 01:20:21 np0005470441 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  4 01:20:21 np0005470441 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  4 01:20:21 np0005470441 python3.9[152365]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:20:21 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:22 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:22 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:22 np0005470441 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  4 01:20:22 np0005470441 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  4 01:20:22 np0005470441 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  4 01:20:22 np0005470441 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  4 01:20:22 np0005470441 systemd[1]: Starting libvirt proxy daemon...
Oct  4 01:20:22 np0005470441 systemd[1]: Started libvirt proxy daemon.
Oct  4 01:20:22 np0005470441 setroubleshoot[152186]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 252656ca-f362-4ed6-8e4d-094f6dd32bb9
Oct  4 01:20:22 np0005470441 setroubleshoot[152186]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  4 01:20:22 np0005470441 setroubleshoot[152186]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 252656ca-f362-4ed6-8e4d-094f6dd32bb9
Oct  4 01:20:22 np0005470441 setroubleshoot[152186]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  4 01:20:23 np0005470441 python3.9[152584]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:20:23 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:23 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:23 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:23 np0005470441 systemd[1]: Listening on libvirt locking daemon socket.
Oct  4 01:20:23 np0005470441 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  4 01:20:23 np0005470441 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  4 01:20:23 np0005470441 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  4 01:20:23 np0005470441 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  4 01:20:23 np0005470441 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  4 01:20:23 np0005470441 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  4 01:20:23 np0005470441 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  4 01:20:23 np0005470441 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  4 01:20:23 np0005470441 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  4 01:20:23 np0005470441 systemd[1]: Starting libvirt QEMU daemon...
Oct  4 01:20:23 np0005470441 systemd[1]: Started libvirt QEMU daemon.
Oct  4 01:20:24 np0005470441 python3.9[152797]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:20:24 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:24 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:24 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:24 np0005470441 systemd[1]: Starting libvirt secret daemon socket...
Oct  4 01:20:24 np0005470441 systemd[1]: Listening on libvirt secret daemon socket.
Oct  4 01:20:24 np0005470441 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  4 01:20:24 np0005470441 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  4 01:20:24 np0005470441 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  4 01:20:24 np0005470441 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  4 01:20:24 np0005470441 systemd[1]: Starting libvirt secret daemon...
Oct  4 01:20:24 np0005470441 systemd[1]: Started libvirt secret daemon.
Oct  4 01:20:25 np0005470441 python3.9[153008]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:26 np0005470441 python3.9[153160]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:20:27 np0005470441 python3.9[153312]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:27 np0005470441 python3.9[153435]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555226.8512933-3321-254031787953629/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:28 np0005470441 python3.9[153587]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:29 np0005470441 python3.9[153739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:30 np0005470441 python3.9[153817]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:30 np0005470441 python3.9[153969]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:31 np0005470441 python3.9[154047]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._nx6f1eu recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:32 np0005470441 python3.9[154199]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:32 np0005470441 python3.9[154277]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:32 np0005470441 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  4 01:20:32 np0005470441 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  4 01:20:33 np0005470441 python3.9[154429]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:20:34 np0005470441 python3[154582]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  4 01:20:35 np0005470441 python3.9[154734]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:35 np0005470441 python3.9[154812]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:36 np0005470441 python3.9[154964]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:36 np0005470441 python3.9[155042]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:37 np0005470441 python3.9[155194]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:38 np0005470441 python3.9[155272]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:38 np0005470441 python3.9[155424]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:39 np0005470441 python3.9[155502]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:40 np0005470441 python3.9[155654]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:40 np0005470441 podman[155751]: 2025-10-04 05:20:40.591602491 +0000 UTC m=+0.055680440 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:20:40 np0005470441 python3.9[155797]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555239.6367786-3696-169971519316706/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:41 np0005470441 python3.9[155950]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:42 np0005470441 python3.9[156102]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:20:43 np0005470441 podman[156205]: 2025-10-04 05:20:43.33812349 +0000 UTC m=+0.082539174 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:20:43 np0005470441 python3.9[156283]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:44 np0005470441 python3.9[156436]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:20:45 np0005470441 python3.9[156589]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:20:45 np0005470441 python3.9[156743]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:20:46 np0005470441 python3.9[156898]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:47 np0005470441 python3.9[157050]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:48 np0005470441 python3.9[157173]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555246.8852813-3912-205936356791604/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:48 np0005470441 python3.9[157325]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:49 np0005470441 python3.9[157448]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555248.3222222-3957-3227604544767/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:50 np0005470441 python3.9[157600]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:20:50 np0005470441 python3.9[157723]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555249.759129-4002-55830632098562/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:20:51 np0005470441 python3.9[157875]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:20:51 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:51 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:51 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:52 np0005470441 systemd[1]: Reached target edpm_libvirt.target.
Oct  4 01:20:53 np0005470441 python3.9[158067]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  4 01:20:53 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:53 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:53 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:53 np0005470441 systemd[1]: Reloading.
Oct  4 01:20:53 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:20:53 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:20:54 np0005470441 systemd[1]: session-25.scope: Deactivated successfully.
Oct  4 01:20:54 np0005470441 systemd[1]: session-25.scope: Consumed 3min 30.770s CPU time.
Oct  4 01:20:54 np0005470441 systemd-logind[796]: Session 25 logged out. Waiting for processes to exit.
Oct  4 01:20:54 np0005470441 systemd-logind[796]: Removed session 25.
Oct  4 01:21:00 np0005470441 systemd-logind[796]: New session 26 of user zuul.
Oct  4 01:21:00 np0005470441 systemd[1]: Started Session 26 of User zuul.
Oct  4 01:21:01 np0005470441 python3.9[158316]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:21:02 np0005470441 python3.9[158472]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:03 np0005470441 python3.9[158624]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:04 np0005470441 python3.9[158776]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:04 np0005470441 python3.9[158928]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  4 01:21:05 np0005470441 python3.9[159080]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:06 np0005470441 python3.9[159232]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:21:06.725 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:21:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:21:06.727 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:21:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:21:06.727 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:21:07 np0005470441 python3.9[159386]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:21:07 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:07 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:07 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:09 np0005470441 python3.9[159574]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:21:09 np0005470441 network[159591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:21:09 np0005470441 network[159592]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:21:09 np0005470441 network[159593]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:21:10 np0005470441 podman[159625]: 2025-10-04 05:21:10.689907419 +0000 UTC m=+0.053637678 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:21:14 np0005470441 podman[159758]: 2025-10-04 05:21:14.341117509 +0000 UTC m=+0.084397473 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:21:15 np0005470441 python3.9[159911]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:21:15 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:16 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:16 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:17 np0005470441 python3.9[160098]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:18 np0005470441 python3.9[160250]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  4 01:21:18 np0005470441 podman[160286]: 2025-10-04 05:21:18.60976099 +0000 UTC m=+0.036909232 container create 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:21:18 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:21:18 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:21:18 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6356] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Oct  4 01:21:18 np0005470441 kernel: podman0: port 1(veth0) entered blocking state
Oct  4 01:21:18 np0005470441 kernel: podman0: port 1(veth0) entered disabled state
Oct  4 01:21:18 np0005470441 kernel: veth0: entered allmulticast mode
Oct  4 01:21:18 np0005470441 kernel: veth0: entered promiscuous mode
Oct  4 01:21:18 np0005470441 kernel: podman0: port 1(veth0) entered blocking state
Oct  4 01:21:18 np0005470441 kernel: podman0: port 1(veth0) entered forwarding state
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6528] device (veth0): carrier: link connected
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6533] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6539] device (podman0): carrier: link connected
Oct  4 01:21:18 np0005470441 systemd-udevd[160316]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:21:18 np0005470441 systemd-udevd[160319]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:21:18 np0005470441 podman[160286]: 2025-10-04 05:21:18.591421138 +0000 UTC m=+0.018569410 image pull 13ffa098770f5095913e3dfecd601fec25536aa84ab5f90403cd9d7e0dc55d92 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6923] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6934] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6945] device (podman0): Activation: starting connection 'podman0' (3a6c3cf7-e8d8-41ea-bc7c-0eafc838ce32)
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6946] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6950] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6951] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.6953] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  4 01:21:18 np0005470441 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.7377] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.7381] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  4 01:21:18 np0005470441 NetworkManager[51690]: <info>  [1759555278.7392] device (podman0): Activation: successful, device activated.
Oct  4 01:21:18 np0005470441 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  4 01:21:18 np0005470441 systemd[1]: Started libpod-conmon-8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d.scope.
Oct  4 01:21:18 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:21:18 np0005470441 podman[160286]: 2025-10-04 05:21:18.98787147 +0000 UTC m=+0.415019722 container init 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  4 01:21:18 np0005470441 podman[160286]: 2025-10-04 05:21:18.996006322 +0000 UTC m=+0.423154564 container start 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  4 01:21:18 np0005470441 podman[160286]: 2025-10-04 05:21:18.999342437 +0000 UTC m=+0.426490679 container attach 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  4 01:21:19 np0005470441 iscsid_config[160444]: iqn.1994-05.com.redhat:8ad8a9f0477b#015
Oct  4 01:21:19 np0005470441 systemd[1]: libpod-8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d.scope: Deactivated successfully.
Oct  4 01:21:19 np0005470441 podman[160286]: 2025-10-04 05:21:19.002663851 +0000 UTC m=+0.429812093 container died 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:21:19 np0005470441 kernel: podman0: port 1(veth0) entered disabled state
Oct  4 01:21:19 np0005470441 kernel: veth0 (unregistering): left allmulticast mode
Oct  4 01:21:19 np0005470441 kernel: veth0 (unregistering): left promiscuous mode
Oct  4 01:21:19 np0005470441 kernel: podman0: port 1(veth0) entered disabled state
Oct  4 01:21:19 np0005470441 NetworkManager[51690]: <info>  [1759555279.0736] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:21:19 np0005470441 systemd[1]: run-netns-netns\x2da5ea9068\x2d8a75\x2dd5aa\x2d4651\x2d213b7c2c9e02.mount: Deactivated successfully.
Oct  4 01:21:19 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d-userdata-shm.mount: Deactivated successfully.
Oct  4 01:21:19 np0005470441 podman[160286]: 2025-10-04 05:21:19.407816212 +0000 UTC m=+0.834964454 container remove 8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:21:19 np0005470441 systemd[1]: libpod-conmon-8a92f369b1ec20debbc11e4c24f2cf9a22d944755705dc3b550eaa778e144c7d.scope: Deactivated successfully.
Oct  4 01:21:19 np0005470441 python3.9[160250]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct  4 01:21:19 np0005470441 python3.9[160250]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  4 01:21:19 np0005470441 systemd[1]: var-lib-containers-storage-overlay-111c838afed9b685eef298197ee1075182a3ee12495280f9d77cc14060697c10-merged.mount: Deactivated successfully.
Oct  4 01:21:24 np0005470441 python3.9[160688]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:25 np0005470441 python3.9[160811]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555283.9266765-318-17405694847031/.source.iscsi _original_basename=.3md41nwo follow=False checksum=6cdea2375955aec3a341c301fa90ea66c4c66e90 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:26 np0005470441 python3.9[160963]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:27 np0005470441 python3.9[161113]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:28 np0005470441 python3.9[161267]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:29 np0005470441 python3.9[161419]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:29 np0005470441 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  4 01:21:29 np0005470441 python3.9[161572]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:30 np0005470441 python3.9[161650]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:31 np0005470441 python3.9[161802]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:31 np0005470441 python3.9[161880]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:32 np0005470441 python3.9[162032]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:33 np0005470441 python3.9[162184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:33 np0005470441 python3.9[162262]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:34 np0005470441 python3.9[162414]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:34 np0005470441 python3.9[162492]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:36 np0005470441 python3.9[162644]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:21:36 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:36 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:36 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:37 np0005470441 python3.9[162834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:37 np0005470441 python3.9[162912]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:38 np0005470441 python3.9[163064]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:39 np0005470441 python3.9[163142]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:39 np0005470441 python3.9[163294]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:21:39 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:39 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:39 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:40 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:21:40 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:21:40 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:21:40 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:21:41 np0005470441 podman[163459]: 2025-10-04 05:21:41.22549865 +0000 UTC m=+0.050958101 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:21:41 np0005470441 python3.9[163506]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:42 np0005470441 python3.9[163658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:42 np0005470441 python3.9[163781]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555301.7276626-780-184175058503279/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:44 np0005470441 python3.9[163933]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:21:44 np0005470441 podman[164057]: 2025-10-04 05:21:44.685274782 +0000 UTC m=+0.112020580 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:21:44 np0005470441 python3.9[164105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:21:45 np0005470441 python3.9[164234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555304.330917-855-273133210045846/.source.json _original_basename=.cbe6wc8w follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:46 np0005470441 python3.9[164386]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:48 np0005470441 python3.9[164813]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  4 01:21:49 np0005470441 python3.9[164965]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:21:50 np0005470441 python3.9[165117]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  4 01:21:52 np0005470441 python3[165296]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:21:52 np0005470441 podman[165330]: 2025-10-04 05:21:52.61109134 +0000 UTC m=+0.048203433 container create 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:21:52 np0005470441 podman[165330]: 2025-10-04 05:21:52.581754615 +0000 UTC m=+0.018866728 image pull 13ffa098770f5095913e3dfecd601fec25536aa84ab5f90403cd9d7e0dc55d92 quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  4 01:21:52 np0005470441 python3[165296]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  4 01:21:53 np0005470441 python3.9[165520]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:54 np0005470441 python3.9[165674]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:54 np0005470441 python3.9[165750]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:55 np0005470441 python3.9[165901]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555315.0591228-1119-17865987061949/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:21:56 np0005470441 python3.9[165977]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:21:56 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:56 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:56 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:57 np0005470441 python3.9[166088]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:21:57 np0005470441 systemd[1]: Reloading.
Oct  4 01:21:57 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:21:57 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:21:57 np0005470441 systemd[1]: Starting iscsid container...
Oct  4 01:21:57 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:21:57 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d9f935fe7d1cd0bb8476d6ac27fef2457fc3391f2172859d794d83e7604b38/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:21:57 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d9f935fe7d1cd0bb8476d6ac27fef2457fc3391f2172859d794d83e7604b38/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  4 01:21:57 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d9f935fe7d1cd0bb8476d6ac27fef2457fc3391f2172859d794d83e7604b38/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:21:57 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa.
Oct  4 01:21:57 np0005470441 podman[166129]: 2025-10-04 05:21:57.824496958 +0000 UTC m=+0.195830104 container init 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:21:57 np0005470441 iscsid[166144]: + sudo -E kolla_set_configs
Oct  4 01:21:57 np0005470441 podman[166129]: 2025-10-04 05:21:57.857914199 +0000 UTC m=+0.229247355 container start 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  4 01:21:57 np0005470441 podman[166129]: iscsid
Oct  4 01:21:57 np0005470441 systemd[1]: Created slice User Slice of UID 0.
Oct  4 01:21:57 np0005470441 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  4 01:21:57 np0005470441 systemd[1]: Started iscsid container.
Oct  4 01:21:57 np0005470441 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  4 01:21:57 np0005470441 systemd[1]: Starting User Manager for UID 0...
Oct  4 01:21:57 np0005470441 podman[166151]: 2025-10-04 05:21:57.932478701 +0000 UTC m=+0.062571872 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:21:57 np0005470441 systemd[1]: 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa-bb9b2acbad351fc.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:21:57 np0005470441 systemd[1]: 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa-bb9b2acbad351fc.service: Failed with result 'exit-code'.
Oct  4 01:21:58 np0005470441 systemd[166162]: Queued start job for default target Main User Target.
Oct  4 01:21:58 np0005470441 systemd[166162]: Created slice User Application Slice.
Oct  4 01:21:58 np0005470441 systemd[166162]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  4 01:21:58 np0005470441 systemd[166162]: Started Daily Cleanup of User's Temporary Directories.
Oct  4 01:21:58 np0005470441 systemd[166162]: Reached target Paths.
Oct  4 01:21:58 np0005470441 systemd[166162]: Reached target Timers.
Oct  4 01:21:58 np0005470441 systemd[166162]: Starting D-Bus User Message Bus Socket...
Oct  4 01:21:58 np0005470441 systemd[166162]: Starting Create User's Volatile Files and Directories...
Oct  4 01:21:58 np0005470441 systemd[166162]: Finished Create User's Volatile Files and Directories.
Oct  4 01:21:58 np0005470441 systemd[166162]: Listening on D-Bus User Message Bus Socket.
Oct  4 01:21:58 np0005470441 systemd[166162]: Reached target Sockets.
Oct  4 01:21:58 np0005470441 systemd[166162]: Reached target Basic System.
Oct  4 01:21:58 np0005470441 systemd[166162]: Reached target Main User Target.
Oct  4 01:21:58 np0005470441 systemd[166162]: Startup finished in 126ms.
Oct  4 01:21:58 np0005470441 systemd[1]: Started User Manager for UID 0.
Oct  4 01:21:58 np0005470441 systemd[1]: Started Session c3 of User root.
Oct  4 01:21:58 np0005470441 iscsid[166144]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:21:58 np0005470441 iscsid[166144]: INFO:__main__:Validating config file
Oct  4 01:21:58 np0005470441 iscsid[166144]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:21:58 np0005470441 iscsid[166144]: INFO:__main__:Writing out command to execute
Oct  4 01:21:58 np0005470441 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  4 01:21:58 np0005470441 iscsid[166144]: ++ cat /run_command
Oct  4 01:21:58 np0005470441 iscsid[166144]: + CMD='/usr/sbin/iscsid -f'
Oct  4 01:21:58 np0005470441 iscsid[166144]: + ARGS=
Oct  4 01:21:58 np0005470441 iscsid[166144]: + sudo kolla_copy_cacerts
Oct  4 01:21:58 np0005470441 systemd[1]: Started Session c4 of User root.
Oct  4 01:21:58 np0005470441 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  4 01:21:58 np0005470441 iscsid[166144]: + [[ ! -n '' ]]
Oct  4 01:21:58 np0005470441 iscsid[166144]: + . kolla_extend_start
Oct  4 01:21:58 np0005470441 iscsid[166144]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  4 01:21:58 np0005470441 iscsid[166144]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  4 01:21:58 np0005470441 iscsid[166144]: Running command: '/usr/sbin/iscsid -f'
Oct  4 01:21:58 np0005470441 iscsid[166144]: + umask 0022
Oct  4 01:21:58 np0005470441 iscsid[166144]: + exec /usr/sbin/iscsid -f
Oct  4 01:21:58 np0005470441 kernel: Loading iSCSI transport class v2.0-870.
Oct  4 01:21:59 np0005470441 python3.9[166350]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:21:59 np0005470441 python3.9[166502]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:00 np0005470441 python3.9[166654]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:22:00 np0005470441 network[166671]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:22:00 np0005470441 network[166672]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:22:00 np0005470441 network[166673]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:22:05 np0005470441 python3.9[166947]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  4 01:22:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:22:06.727 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:22:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:22:06.727 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:22:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:22:06.727 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:22:06 np0005470441 python3.9[167099]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  4 01:22:07 np0005470441 python3.9[167255]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:08 np0005470441 python3.9[167378]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555327.0636775-1341-115502661317295/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:08 np0005470441 systemd[1]: Stopping User Manager for UID 0...
Oct  4 01:22:08 np0005470441 systemd[166162]: Activating special unit Exit the Session...
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped target Main User Target.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped target Basic System.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped target Paths.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped target Sockets.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped target Timers.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  4 01:22:08 np0005470441 systemd[166162]: Closed D-Bus User Message Bus Socket.
Oct  4 01:22:08 np0005470441 systemd[166162]: Stopped Create User's Volatile Files and Directories.
Oct  4 01:22:08 np0005470441 systemd[166162]: Removed slice User Application Slice.
Oct  4 01:22:08 np0005470441 systemd[166162]: Reached target Shutdown.
Oct  4 01:22:08 np0005470441 systemd[166162]: Finished Exit the Session.
Oct  4 01:22:08 np0005470441 systemd[166162]: Reached target Exit the Session.
Oct  4 01:22:08 np0005470441 systemd[1]: user@0.service: Deactivated successfully.
Oct  4 01:22:08 np0005470441 systemd[1]: Stopped User Manager for UID 0.
Oct  4 01:22:08 np0005470441 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  4 01:22:08 np0005470441 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  4 01:22:08 np0005470441 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  4 01:22:08 np0005470441 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  4 01:22:08 np0005470441 systemd[1]: Removed slice User Slice of UID 0.
Oct  4 01:22:09 np0005470441 python3.9[167531]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:10 np0005470441 python3.9[167683]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:22:10 np0005470441 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  4 01:22:10 np0005470441 systemd[1]: Stopped Load Kernel Modules.
Oct  4 01:22:10 np0005470441 systemd[1]: Stopping Load Kernel Modules...
Oct  4 01:22:10 np0005470441 systemd[1]: Starting Load Kernel Modules...
Oct  4 01:22:10 np0005470441 systemd[1]: Finished Load Kernel Modules.
Oct  4 01:22:10 np0005470441 python3.9[167839]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:11 np0005470441 podman[167963]: 2025-10-04 05:22:11.699603819 +0000 UTC m=+0.072026351 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:22:11 np0005470441 python3.9[168008]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:12 np0005470441 python3.9[168162]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:13 np0005470441 python3.9[168314]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:14 np0005470441 python3.9[168437]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555332.9992363-1515-103751190850489/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:14 np0005470441 podman[168561]: 2025-10-04 05:22:14.826413134 +0000 UTC m=+0.088550831 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:22:14 np0005470441 python3.9[168610]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:22:15 np0005470441 python3.9[168769]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:16 np0005470441 python3.9[168921]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:17 np0005470441 python3.9[169073]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:18 np0005470441 python3.9[169225]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:18 np0005470441 python3.9[169377]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:19 np0005470441 python3.9[169529]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:20 np0005470441 python3.9[169681]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:20 np0005470441 python3.9[169833]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:21 np0005470441 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  4 01:22:21 np0005470441 python3.9[169988]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:22 np0005470441 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  4 01:22:22 np0005470441 python3.9[170140]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:23 np0005470441 python3.9[170293]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:23 np0005470441 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  4 01:22:23 np0005470441 python3.9[170372]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:24 np0005470441 python3.9[170524]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:24 np0005470441 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  4 01:22:25 np0005470441 python3.9[170603]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:25 np0005470441 python3.9[170755]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:26 np0005470441 python3.9[170907]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:27 np0005470441 python3.9[170985]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:28 np0005470441 python3.9[171137]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:28 np0005470441 podman[171141]: 2025-10-04 05:22:28.348991611 +0000 UTC m=+0.099672928 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:22:28 np0005470441 python3.9[171236]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:29 np0005470441 python3.9[171388]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:22:29 np0005470441 systemd[1]: Reloading.
Oct  4 01:22:29 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:22:29 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:22:30 np0005470441 python3.9[171577]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:31 np0005470441 python3.9[171655]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:32 np0005470441 python3.9[171807]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:32 np0005470441 python3.9[171885]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:33 np0005470441 python3.9[172037]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:22:33 np0005470441 systemd[1]: Reloading.
Oct  4 01:22:33 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:22:33 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:22:34 np0005470441 systemd[1]: Starting Create netns directory...
Oct  4 01:22:34 np0005470441 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  4 01:22:34 np0005470441 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  4 01:22:34 np0005470441 systemd[1]: Finished Create netns directory.
Oct  4 01:22:34 np0005470441 python3.9[172231]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:35 np0005470441 python3.9[172383]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:36 np0005470441 python3.9[172506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555355.2741067-2136-26537399683193/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:37 np0005470441 python3.9[172658]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:22:38 np0005470441 python3.9[172810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:38 np0005470441 python3.9[172933]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555357.737127-2211-114588980249282/.source.json _original_basename=._d0za41n follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:39 np0005470441 python3.9[173085]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:41 np0005470441 python3.9[173512]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  4 01:22:42 np0005470441 podman[173560]: 2025-10-04 05:22:42.298635594 +0000 UTC m=+0.055418168 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:22:42 np0005470441 python3.9[173681]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:22:43 np0005470441 python3.9[173833]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  4 01:22:45 np0005470441 python3[174009]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:22:45 np0005470441 podman[174045]: 2025-10-04 05:22:45.223015959 +0000 UTC m=+0.063176419 container create 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:22:45 np0005470441 podman[174045]: 2025-10-04 05:22:45.181005533 +0000 UTC m=+0.021166013 image pull 5652055d294fa12a03c8287ea23106a5617d67d9b2c36e3419473120055c6b9a quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  4 01:22:45 np0005470441 python3[174009]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  4 01:22:45 np0005470441 podman[174059]: 2025-10-04 05:22:45.340488172 +0000 UTC m=+0.087119170 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:22:46 np0005470441 python3.9[174259]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:47 np0005470441 python3.9[174413]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:47 np0005470441 python3.9[174489]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:48 np0005470441 python3.9[174640]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555367.545468-2475-43018136108452/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:48 np0005470441 python3.9[174716]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:22:48 np0005470441 systemd[1]: Reloading.
Oct  4 01:22:48 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:22:48 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:22:49 np0005470441 python3.9[174827]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:22:49 np0005470441 systemd[1]: Reloading.
Oct  4 01:22:49 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:22:49 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:22:49 np0005470441 systemd[1]: Starting multipathd container...
Oct  4 01:22:49 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:22:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8d910e9ac6b4947dab2be7f66d4fd834281565ce471cc37fddcb9ca83a3af7e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  4 01:22:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8d910e9ac6b4947dab2be7f66d4fd834281565ce471cc37fddcb9ca83a3af7e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:22:50 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.
Oct  4 01:22:50 np0005470441 podman[174868]: 2025-10-04 05:22:50.049200196 +0000 UTC m=+0.144569015 container init 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:22:50 np0005470441 multipathd[174883]: + sudo -E kolla_set_configs
Oct  4 01:22:50 np0005470441 podman[174868]: 2025-10-04 05:22:50.071648945 +0000 UTC m=+0.167017744 container start 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:22:50 np0005470441 podman[174868]: multipathd
Oct  4 01:22:50 np0005470441 systemd[1]: Started multipathd container.
Oct  4 01:22:50 np0005470441 multipathd[174883]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:22:50 np0005470441 multipathd[174883]: INFO:__main__:Validating config file
Oct  4 01:22:50 np0005470441 multipathd[174883]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:22:50 np0005470441 multipathd[174883]: INFO:__main__:Writing out command to execute
Oct  4 01:22:50 np0005470441 multipathd[174883]: ++ cat /run_command
Oct  4 01:22:50 np0005470441 multipathd[174883]: + CMD='/usr/sbin/multipathd -d'
Oct  4 01:22:50 np0005470441 multipathd[174883]: + ARGS=
Oct  4 01:22:50 np0005470441 multipathd[174883]: + sudo kolla_copy_cacerts
Oct  4 01:22:50 np0005470441 podman[174890]: 2025-10-04 05:22:50.150248212 +0000 UTC m=+0.066515394 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Oct  4 01:22:50 np0005470441 systemd[1]: 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-1c00a3a3319f44e1.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:22:50 np0005470441 systemd[1]: 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-1c00a3a3319f44e1.service: Failed with result 'exit-code'.
Oct  4 01:22:50 np0005470441 multipathd[174883]: + [[ ! -n '' ]]
Oct  4 01:22:50 np0005470441 multipathd[174883]: + . kolla_extend_start
Oct  4 01:22:50 np0005470441 multipathd[174883]: Running command: '/usr/sbin/multipathd -d'
Oct  4 01:22:50 np0005470441 multipathd[174883]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  4 01:22:50 np0005470441 multipathd[174883]: + umask 0022
Oct  4 01:22:50 np0005470441 multipathd[174883]: + exec /usr/sbin/multipathd -d
Oct  4 01:22:50 np0005470441 multipathd[174883]: 3267.893185 | --------start up--------
Oct  4 01:22:50 np0005470441 multipathd[174883]: 3267.893208 | read /etc/multipath.conf
Oct  4 01:22:50 np0005470441 multipathd[174883]: 3267.898735 | path checkers start up
Oct  4 01:22:51 np0005470441 python3.9[175071]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:22:52 np0005470441 python3.9[175225]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:22:53 np0005470441 python3.9[175390]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:22:53 np0005470441 systemd[1]: Stopping multipathd container...
Oct  4 01:22:53 np0005470441 multipathd[174883]: 3271.446506 | exit (signal)
Oct  4 01:22:53 np0005470441 multipathd[174883]: 3271.446563 | --------shut down-------
Oct  4 01:22:53 np0005470441 systemd[1]: libpod-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.scope: Deactivated successfully.
Oct  4 01:22:53 np0005470441 podman[175394]: 2025-10-04 05:22:53.763400568 +0000 UTC m=+0.065337760 container died 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  4 01:22:53 np0005470441 systemd[1]: 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-1c00a3a3319f44e1.timer: Deactivated successfully.
Oct  4 01:22:53 np0005470441 systemd[1]: Stopped /usr/bin/podman healthcheck run 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.
Oct  4 01:22:53 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-userdata-shm.mount: Deactivated successfully.
Oct  4 01:22:53 np0005470441 systemd[1]: var-lib-containers-storage-overlay-d8d910e9ac6b4947dab2be7f66d4fd834281565ce471cc37fddcb9ca83a3af7e-merged.mount: Deactivated successfully.
Oct  4 01:22:53 np0005470441 podman[175394]: 2025-10-04 05:22:53.80808494 +0000 UTC m=+0.110022132 container cleanup 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:22:53 np0005470441 podman[175394]: multipathd
Oct  4 01:22:53 np0005470441 podman[175421]: multipathd
Oct  4 01:22:53 np0005470441 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  4 01:22:53 np0005470441 systemd[1]: Stopped multipathd container.
Oct  4 01:22:53 np0005470441 systemd[1]: Starting multipathd container...
Oct  4 01:22:53 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:22:53 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8d910e9ac6b4947dab2be7f66d4fd834281565ce471cc37fddcb9ca83a3af7e/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  4 01:22:53 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8d910e9ac6b4947dab2be7f66d4fd834281565ce471cc37fddcb9ca83a3af7e/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:22:53 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.
Oct  4 01:22:53 np0005470441 podman[175434]: 2025-10-04 05:22:53.987814575 +0000 UTC m=+0.096418055 container init 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:22:53 np0005470441 multipathd[175449]: + sudo -E kolla_set_configs
Oct  4 01:22:54 np0005470441 podman[175434]: 2025-10-04 05:22:54.013049543 +0000 UTC m=+0.121653003 container start 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  4 01:22:54 np0005470441 podman[175434]: multipathd
Oct  4 01:22:54 np0005470441 systemd[1]: Started multipathd container.
Oct  4 01:22:54 np0005470441 multipathd[175449]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:22:54 np0005470441 multipathd[175449]: INFO:__main__:Validating config file
Oct  4 01:22:54 np0005470441 multipathd[175449]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:22:54 np0005470441 multipathd[175449]: INFO:__main__:Writing out command to execute
Oct  4 01:22:54 np0005470441 multipathd[175449]: ++ cat /run_command
Oct  4 01:22:54 np0005470441 multipathd[175449]: + CMD='/usr/sbin/multipathd -d'
Oct  4 01:22:54 np0005470441 multipathd[175449]: + ARGS=
Oct  4 01:22:54 np0005470441 multipathd[175449]: + sudo kolla_copy_cacerts
Oct  4 01:22:54 np0005470441 podman[175456]: 2025-10-04 05:22:54.087697998 +0000 UTC m=+0.058472455 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  4 01:22:54 np0005470441 multipathd[175449]: + [[ ! -n '' ]]
Oct  4 01:22:54 np0005470441 multipathd[175449]: + . kolla_extend_start
Oct  4 01:22:54 np0005470441 multipathd[175449]: Running command: '/usr/sbin/multipathd -d'
Oct  4 01:22:54 np0005470441 multipathd[175449]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  4 01:22:54 np0005470441 multipathd[175449]: + umask 0022
Oct  4 01:22:54 np0005470441 multipathd[175449]: + exec /usr/sbin/multipathd -d
Oct  4 01:22:54 np0005470441 systemd[1]: 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-4f25ac5b99b81a89.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:22:54 np0005470441 systemd[1]: 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8-4f25ac5b99b81a89.service: Failed with result 'exit-code'.
Oct  4 01:22:54 np0005470441 multipathd[175449]: 3271.818946 | --------start up--------
Oct  4 01:22:54 np0005470441 multipathd[175449]: 3271.818968 | read /etc/multipath.conf
Oct  4 01:22:54 np0005470441 multipathd[175449]: 3271.824335 | path checkers start up
Oct  4 01:22:55 np0005470441 python3.9[175640]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:56 np0005470441 python3.9[175792]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  4 01:22:57 np0005470441 python3.9[175944]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  4 01:22:57 np0005470441 kernel: Key type psk registered
Oct  4 01:22:58 np0005470441 python3.9[176108]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:22:58 np0005470441 podman[176203]: 2025-10-04 05:22:58.859604901 +0000 UTC m=+0.087277715 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:22:59 np0005470441 python3.9[176250]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555377.9857423-2715-161987478148940/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:22:59 np0005470441 python3.9[176403]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:00 np0005470441 python3.9[176555]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:23:00 np0005470441 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  4 01:23:00 np0005470441 systemd[1]: Stopped Load Kernel Modules.
Oct  4 01:23:00 np0005470441 systemd[1]: Stopping Load Kernel Modules...
Oct  4 01:23:00 np0005470441 systemd[1]: Starting Load Kernel Modules...
Oct  4 01:23:00 np0005470441 systemd[1]: Finished Load Kernel Modules.
Oct  4 01:23:01 np0005470441 python3.9[176711]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  4 01:23:02 np0005470441 python3.9[176795]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  4 01:23:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:23:06.728 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:23:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:23:06.729 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:23:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:23:06.729 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:23:10 np0005470441 systemd[1]: Reloading.
Oct  4 01:23:10 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:23:10 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:23:10 np0005470441 systemd[1]: Reloading.
Oct  4 01:23:10 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:23:10 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:23:11 np0005470441 systemd-logind[796]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  4 01:23:11 np0005470441 systemd-logind[796]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  4 01:23:11 np0005470441 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  4 01:23:11 np0005470441 systemd[1]: Starting man-db-cache-update.service...
Oct  4 01:23:11 np0005470441 systemd[1]: Reloading.
Oct  4 01:23:11 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:23:11 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:23:12 np0005470441 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  4 01:23:13 np0005470441 podman[178095]: 2025-10-04 05:23:13.316382055 +0000 UTC m=+0.060710699 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:23:13 np0005470441 python3.9[178265]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:14 np0005470441 python3.9[178415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:23:15 np0005470441 podman[178474]: 2025-10-04 05:23:15.595538237 +0000 UTC m=+0.094208982 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, org.label-schema.build-date=20251001, config_id=ovn_controller)
Oct  4 01:23:15 np0005470441 python3.9[178597]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:16 np0005470441 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  4 01:23:16 np0005470441 systemd[1]: Finished man-db-cache-update.service.
Oct  4 01:23:16 np0005470441 systemd[1]: man-db-cache-update.service: Consumed 1.528s CPU time.
Oct  4 01:23:16 np0005470441 systemd[1]: run-r728d583017a14dabbebcc1af2844ce15.service: Deactivated successfully.
Oct  4 01:23:17 np0005470441 python3.9[178750]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:23:17 np0005470441 systemd[1]: Reloading.
Oct  4 01:23:17 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:23:18 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:23:18 np0005470441 python3.9[178935]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:23:18 np0005470441 network[178952]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:23:18 np0005470441 network[178953]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:23:18 np0005470441 network[178954]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:23:23 np0005470441 python3.9[179231]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:24 np0005470441 podman[179385]: 2025-10-04 05:23:24.321588271 +0000 UTC m=+0.074267074 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:23:24 np0005470441 python3.9[179384]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:25 np0005470441 python3.9[179559]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:25 np0005470441 python3.9[179712]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:26 np0005470441 python3.9[179865]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:27 np0005470441 python3.9[180018]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:28 np0005470441 python3.9[180171]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:28 np0005470441 python3.9[180324]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:23:29 np0005470441 podman[180326]: 2025-10-04 05:23:29.08878905 +0000 UTC m=+0.061359446 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:23:31 np0005470441 python3.9[180500]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:32 np0005470441 python3.9[180652]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:32 np0005470441 python3.9[180804]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:33 np0005470441 python3.9[180956]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:33 np0005470441 python3.9[181108]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:34 np0005470441 python3.9[181260]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:35 np0005470441 python3.9[181412]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:35 np0005470441 python3.9[181564]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:37 np0005470441 python3.9[181716]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:38 np0005470441 python3.9[181868]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:38 np0005470441 python3.9[182020]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:39 np0005470441 python3.9[182172]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:39 np0005470441 python3.9[182324]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:40 np0005470441 python3.9[182476]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:41 np0005470441 python3.9[182628]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:41 np0005470441 python3.9[182780]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:23:43 np0005470441 python3.9[182932]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:43 np0005470441 podman[183058]: 2025-10-04 05:23:43.916857709 +0000 UTC m=+0.049782038 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:23:44 np0005470441 python3.9[183101]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:23:44 np0005470441 python3.9[183255]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:23:44 np0005470441 systemd[1]: Reloading.
Oct  4 01:23:45 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:23:45 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:23:46 np0005470441 podman[183414]: 2025-10-04 05:23:46.105538887 +0000 UTC m=+0.094109000 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  4 01:23:46 np0005470441 python3.9[183460]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:46 np0005470441 python3.9[183621]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:47 np0005470441 python3.9[183774]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:48 np0005470441 python3.9[183927]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:48 np0005470441 python3.9[184080]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:49 np0005470441 python3.9[184233]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:50 np0005470441 python3.9[184386]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:50 np0005470441 python3.9[184539]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:23:52 np0005470441 python3.9[184692]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:53 np0005470441 python3.9[184844]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:53 np0005470441 python3.9[184996]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:54 np0005470441 podman[185120]: 2025-10-04 05:23:54.564560491 +0000 UTC m=+0.056086987 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:23:54 np0005470441 python3.9[185168]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:55 np0005470441 python3.9[185320]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:55 np0005470441 python3.9[185472]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:56 np0005470441 python3.9[185624]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:57 np0005470441 python3.9[185776]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:57 np0005470441 python3.9[185928]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:58 np0005470441 python3.9[186080]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:23:59 np0005470441 podman[186180]: 2025-10-04 05:23:59.322279881 +0000 UTC m=+0.075331565 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid)
Oct  4 01:23:59 np0005470441 python3.9[186252]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:00 np0005470441 python3.9[186404]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:06 np0005470441 python3.9[186556]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  4 01:24:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:24:06.729 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:24:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:24:06.730 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:24:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:24:06.730 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:24:07 np0005470441 python3.9[186709]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:24:08 np0005470441 python3.9[186867]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  4 01:24:12 np0005470441 systemd-logind[796]: New session 28 of user zuul.
Oct  4 01:24:12 np0005470441 systemd[1]: Started Session 28 of User zuul.
Oct  4 01:24:12 np0005470441 systemd[1]: session-28.scope: Deactivated successfully.
Oct  4 01:24:12 np0005470441 systemd-logind[796]: Session 28 logged out. Waiting for processes to exit.
Oct  4 01:24:12 np0005470441 systemd-logind[796]: Removed session 28.
Oct  4 01:24:13 np0005470441 python3.9[187053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:13 np0005470441 python3.9[187174]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555452.5667694-4332-237388985599716/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:14 np0005470441 podman[187298]: 2025-10-04 05:24:14.004182571 +0000 UTC m=+0.058011362 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  4 01:24:14 np0005470441 python3.9[187337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:14 np0005470441 python3.9[187420]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:15 np0005470441 python3.9[187570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:15 np0005470441 python3.9[187691]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555454.857755-4332-138746451416329/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:16 np0005470441 podman[187815]: 2025-10-04 05:24:16.33031529 +0000 UTC m=+0.082923790 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Oct  4 01:24:16 np0005470441 python3.9[187850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:17 np0005470441 python3.9[187988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555456.0151525-4332-163382821967602/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:17 np0005470441 python3.9[188138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:18 np0005470441 python3.9[188259]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555457.2350004-4332-259402686172229/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:19 np0005470441 python3.9[188411]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:24:20 np0005470441 python3.9[188563]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:24:20 np0005470441 python3.9[188715]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:21 np0005470441 python3.9[188867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:22 np0005470441 python3.9[188990]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759555461.1273468-4611-123752525679106/.source _original_basename=.4ti8j8oa follow=False checksum=f96f48fcd951a59076fa21f5c60b874850ff7dc7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  4 01:24:23 np0005470441 python3.9[189142]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:23 np0005470441 python3.9[189294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:24 np0005470441 podman[189389]: 2025-10-04 05:24:24.751342754 +0000 UTC m=+0.051507257 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:24:24 np0005470441 python3.9[189428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555463.4722128-4689-70333107082184/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:25 np0005470441 python3.9[189585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:24:26 np0005470441 python3.9[189706]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555465.3151824-4734-1622438727544/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:24:27 np0005470441 python3.9[189858]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  4 01:24:28 np0005470441 python3.9[190010]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:24:29 np0005470441 podman[190134]: 2025-10-04 05:24:29.443955501 +0000 UTC m=+0.052667620 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid)
Oct  4 01:24:29 np0005470441 python3[190181]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:24:29 np0005470441 podman[190217]: 2025-10-04 05:24:29.900339479 +0000 UTC m=+0.049854200 container create f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251001)
Oct  4 01:24:29 np0005470441 podman[190217]: 2025-10-04 05:24:29.873791413 +0000 UTC m=+0.023306164 image pull 2c237c6794a3227fe6b226ac969a4d71d1a5c1686381c5edb016d0bc4442832a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  4 01:24:29 np0005470441 python3[190181]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  4 01:24:32 np0005470441 python3.9[190407]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:33 np0005470441 python3.9[190561]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  4 01:24:34 np0005470441 python3.9[190713]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:24:36 np0005470441 python3[190865]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:24:36 np0005470441 podman[190903]: 2025-10-04 05:24:36.214640247 +0000 UTC m=+0.017847239 image pull 2c237c6794a3227fe6b226ac969a4d71d1a5c1686381c5edb016d0bc4442832a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  4 01:24:36 np0005470441 podman[190903]: 2025-10-04 05:24:36.410221383 +0000 UTC m=+0.213428355 container create 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251001, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2)
Oct  4 01:24:36 np0005470441 python3[190865]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct  4 01:24:37 np0005470441 python3.9[191093]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:38 np0005470441 python3.9[191247]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:24:39 np0005470441 python3.9[191398]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555478.3272567-5010-102357733446499/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:24:39 np0005470441 python3.9[191474]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:24:40 np0005470441 systemd[1]: Reloading.
Oct  4 01:24:40 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:24:40 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:24:40 np0005470441 python3.9[191585]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:24:40 np0005470441 systemd[1]: Reloading.
Oct  4 01:24:41 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:24:41 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:24:41 np0005470441 systemd[1]: Starting nova_compute container...
Oct  4 01:24:41 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:24:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:41 np0005470441 podman[191625]: 2025-10-04 05:24:41.449686 +0000 UTC m=+0.138529743 container init 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  4 01:24:41 np0005470441 podman[191625]: 2025-10-04 05:24:41.469054552 +0000 UTC m=+0.157898215 container start 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + sudo -E kolla_set_configs
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Validating config file
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying service configuration files
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Deleting /etc/ceph
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Creating directory /etc/ceph
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /etc/ceph
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Writing out command to execute
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:41 np0005470441 nova_compute[191640]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  4 01:24:41 np0005470441 nova_compute[191640]: ++ cat /run_command
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + CMD=nova-compute
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + ARGS=
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + sudo kolla_copy_cacerts
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + [[ ! -n '' ]]
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + . kolla_extend_start
Oct  4 01:24:41 np0005470441 nova_compute[191640]: Running command: 'nova-compute'
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + echo 'Running command: '\''nova-compute'\'''
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + umask 0022
Oct  4 01:24:41 np0005470441 nova_compute[191640]: + exec nova-compute
Oct  4 01:24:41 np0005470441 podman[191625]: nova_compute
Oct  4 01:24:41 np0005470441 systemd[1]: Started nova_compute container.
Oct  4 01:24:42 np0005470441 python3.9[191802]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.615 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.616 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.616 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.616 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  4 01:24:43 np0005470441 python3.9[191952]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.763 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:24:43 np0005470441 nova_compute[191640]: 2025-10-04 05:24:43.774 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.330 2 INFO nova.virt.driver [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  4 01:24:44 np0005470441 podman[192080]: 2025-10-04 05:24:44.330896365 +0000 UTC m=+0.055085379 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.425 2 INFO nova.compute.provider_config [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.463 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.463 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.463 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.464 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.465 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.466 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.467 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.468 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.469 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.470 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.471 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.472 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.473 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.474 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.475 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.475 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.475 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.475 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.475 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.476 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.477 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.478 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.479 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.479 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.479 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.479 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.479 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.480 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.481 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.481 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.481 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.481 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.481 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.482 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.483 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.484 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.485 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.486 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.487 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.488 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.489 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.490 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.491 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.492 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.493 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.494 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.495 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.496 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.497 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.498 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.499 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.500 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.501 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.502 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.503 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.504 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.505 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.506 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.507 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.508 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.509 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.510 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.511 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.512 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.513 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.513 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.513 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.513 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.513 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.514 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.515 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.516 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.516 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.516 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.516 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.516 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 python3.9[192116]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.517 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.518 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.519 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.520 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.520 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.520 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.520 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.520 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.521 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.522 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.523 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.523 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.523 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.523 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.523 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.524 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.524 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.524 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.524 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.524 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.525 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.526 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.527 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.528 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.529 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.530 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.531 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.532 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.533 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.534 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.535 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.536 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.536 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.536 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.536 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.536 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.537 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.537 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.537 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.537 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.537 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.538 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.539 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 WARNING oslo_config.cfg [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  4 01:24:44 np0005470441 nova_compute[191640]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  4 01:24:44 np0005470441 nova_compute[191640]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  4 01:24:44 np0005470441 nova_compute[191640]: and ``live_migration_inbound_addr`` respectively.
Oct  4 01:24:44 np0005470441 nova_compute[191640]: ).  Its value may be silently ignored in the future.#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.540 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.541 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.542 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.543 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.544 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.545 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.545 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.545 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.545 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.545 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.546 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.547 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.548 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.549 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.549 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.549 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.549 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.549 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.550 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.551 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.551 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.551 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.551 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.551 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.552 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.553 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.553 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.553 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.553 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.553 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.554 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.555 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.556 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.556 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.556 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.556 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.556 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.557 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.558 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.559 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.560 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.561 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.562 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.563 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.564 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.565 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.566 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.567 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.568 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.568 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.568 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.568 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.568 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.569 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.570 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.571 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.572 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.572 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.572 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.572 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.572 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.573 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.574 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.575 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.576 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.577 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.578 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.579 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.579 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.579 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.579 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.579 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.580 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.580 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.580 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.580 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.580 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.581 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.581 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.581 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.581 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.581 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.582 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.583 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.584 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.585 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.586 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.586 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.586 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.586 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.587 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.587 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.587 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.587 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.588 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.588 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.588 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.588 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.588 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.589 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.589 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.589 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.589 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.589 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.590 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.591 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.592 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.593 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.594 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.595 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.596 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.597 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.598 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.599 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.600 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.601 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.602 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.603 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.604 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.605 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.606 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.607 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.607 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.607 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.607 2 DEBUG oslo_service.service [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.608 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.626 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.627 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.628 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.628 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  4 01:24:44 np0005470441 systemd[1]: Starting libvirt QEMU daemon...
Oct  4 01:24:44 np0005470441 systemd[1]: Started libvirt QEMU daemon.
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.707 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3b21b11100> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.709 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3b21b11100> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.709 2 INFO nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.734 2 WARNING nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  4 01:24:44 np0005470441 nova_compute[191640]: 2025-10-04 05:24:44.734 2 DEBUG nova.virt.libvirt.volume.mount [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  4 01:24:45 np0005470441 python3.9[192326]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  4 01:24:45 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.506 2 INFO nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Libvirt host capabilities <capabilities>
Oct  4 01:24:45 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <host>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <uuid>2c012175-5984-4641-9ddd-886dcd6e4c6f</uuid>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <arch>x86_64</arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model>EPYC-Rome-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <vendor>AMD</vendor>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <microcode version='16777317'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <signature family='23' model='49' stepping='0'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='x2apic'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='tsc-deadline'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='osxsave'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='hypervisor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='tsc_adjust'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='spec-ctrl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='stibp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='arch-capabilities'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='cmp_legacy'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='topoext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='virt-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='lbrv'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='tsc-scale'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='vmcb-clean'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='pause-filter'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='pfthreshold'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='svme-addr-chk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='rdctl-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='skip-l1dfl-vmentry'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='mds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature name='pschange-mc-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <pages unit='KiB' size='4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <pages unit='KiB' size='2048'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <pages unit='KiB' size='1048576'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <power_management>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <suspend_mem/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <suspend_disk/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <suspend_hybrid/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </power_management>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <iommu support='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <migration_features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <live/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <uri_transports>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <uri_transport>tcp</uri_transport>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <uri_transport>rdma</uri_transport>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </uri_transports>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </migration_features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <topology>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <cells num='1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <cell id='0'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <memory unit='KiB'>7864096</memory>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <pages unit='KiB' size='4'>1966024</pages>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <pages unit='KiB' size='2048'>0</pages>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <distances>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <sibling id='0' value='10'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          </distances>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          <cpus num='8'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:          </cpus>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        </cell>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </cells>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </topology>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <cache>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </cache>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <secmodel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model>selinux</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <doi>0</doi>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </secmodel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <secmodel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model>dac</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <doi>0</doi>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </secmodel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </host>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <guest>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <os_type>hvm</os_type>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <arch name='i686'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <wordsize>32</wordsize>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <domain type='qemu'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <domain type='kvm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <pae/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <nonpae/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <acpi default='on' toggle='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <apic default='on' toggle='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <cpuselection/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <deviceboot/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <disksnapshot default='on' toggle='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <externalSnapshot/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </guest>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <guest>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <os_type>hvm</os_type>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <arch name='x86_64'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <wordsize>64</wordsize>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <domain type='qemu'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <domain type='kvm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <acpi default='on' toggle='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <apic default='on' toggle='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <cpuselection/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <deviceboot/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <disksnapshot default='on' toggle='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <externalSnapshot/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </guest>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </capabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.516 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.537 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  4 01:24:45 np0005470441 nova_compute[191640]: <domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <domain>kvm</domain>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <arch>i686</arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <vcpu max='240'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <iothreads supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <os supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='firmware'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <loader supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>rom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pflash</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='readonly'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>yes</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='secure'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </loader>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </os>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='maximumMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <vendor>AMD</vendor>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='succor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='custom' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-128'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-256'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-512'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <memoryBacking supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='sourceType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>file</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>anonymous</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>memfd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </memoryBacking>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <disk supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='diskDevice'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>disk</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cdrom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>floppy</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>lun</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ide</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>fdc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>sata</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </disk>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <graphics supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vnc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egl-headless</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>dbus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </graphics>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <video supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='modelType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vga</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cirrus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>none</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>bochs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ramfb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </video>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hostdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='mode'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>subsystem</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='startupPolicy'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>mandatory</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>requisite</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>optional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='subsysType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pci</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='capsType'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='pciBackend'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hostdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <rng supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>random</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </rng>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <filesystem supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='driverType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>path</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>handle</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtiofs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </filesystem>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <tpm supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-tis</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-crb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emulator</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>external</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendVersion'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>2.0</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </tpm>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <redirdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </redirdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <channel supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pty</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>unix</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </channel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <crypto supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>qemu</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </crypto>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <interface supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>passt</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </interface>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <panic supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>isa</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>hyperv</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </panic>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <gic supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <genid supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backup supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <async-teardown supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <ps2 supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sev supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sgx supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hyperv supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='features'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>relaxed</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vapic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>spinlocks</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vpindex</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>runtime</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>synic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>stimer</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reset</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vendor_id</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>frequencies</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reenlightenment</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tlbflush</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ipi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>avic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emsr_bitmap</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>xmm_input</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hyperv>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <launchSecurity supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.544 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  4 01:24:45 np0005470441 nova_compute[191640]: <domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <domain>kvm</domain>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <arch>i686</arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <vcpu max='4096'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <iothreads supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <os supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='firmware'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <loader supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>rom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pflash</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='readonly'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>yes</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='secure'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </loader>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </os>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='maximumMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <vendor>AMD</vendor>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='succor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='custom' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-128'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-256'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-512'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <memoryBacking supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='sourceType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>file</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>anonymous</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>memfd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </memoryBacking>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <disk supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='diskDevice'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>disk</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cdrom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>floppy</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>lun</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>fdc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>sata</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </disk>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <graphics supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vnc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egl-headless</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>dbus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </graphics>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <video supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='modelType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vga</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cirrus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>none</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>bochs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ramfb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </video>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hostdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='mode'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>subsystem</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='startupPolicy'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>mandatory</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>requisite</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>optional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='subsysType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pci</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='capsType'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='pciBackend'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hostdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <rng supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>random</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </rng>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <filesystem supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='driverType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>path</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>handle</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtiofs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </filesystem>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <tpm supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-tis</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-crb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emulator</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>external</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendVersion'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>2.0</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </tpm>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <redirdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </redirdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <channel supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pty</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>unix</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </channel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <crypto supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>qemu</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </crypto>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <interface supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>passt</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </interface>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <panic supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>isa</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>hyperv</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </panic>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <gic supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <genid supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backup supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <async-teardown supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <ps2 supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sev supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sgx supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hyperv supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='features'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>relaxed</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vapic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>spinlocks</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vpindex</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>runtime</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>synic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>stimer</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reset</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vendor_id</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>frequencies</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reenlightenment</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tlbflush</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ipi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>avic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emsr_bitmap</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>xmm_input</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hyperv>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <launchSecurity supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.572 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.576 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  4 01:24:45 np0005470441 nova_compute[191640]: <domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <domain>kvm</domain>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <arch>x86_64</arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <vcpu max='240'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <iothreads supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <os supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='firmware'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <loader supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>rom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pflash</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='readonly'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>yes</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='secure'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </loader>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </os>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='maximumMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <vendor>AMD</vendor>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='succor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='custom' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-128'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-256'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-512'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <memoryBacking supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='sourceType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>file</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>anonymous</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>memfd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </memoryBacking>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <disk supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='diskDevice'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>disk</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cdrom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>floppy</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>lun</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ide</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>fdc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>sata</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </disk>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <graphics supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vnc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egl-headless</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>dbus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </graphics>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <video supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='modelType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vga</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cirrus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>none</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>bochs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ramfb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </video>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hostdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='mode'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>subsystem</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='startupPolicy'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>mandatory</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>requisite</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>optional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='subsysType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pci</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='capsType'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='pciBackend'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hostdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <rng supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>random</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </rng>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <filesystem supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='driverType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>path</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>handle</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtiofs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </filesystem>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <tpm supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-tis</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-crb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emulator</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>external</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendVersion'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>2.0</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </tpm>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <redirdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </redirdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <channel supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pty</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>unix</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </channel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <crypto supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>qemu</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </crypto>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <interface supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>passt</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </interface>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <panic supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>isa</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>hyperv</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </panic>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <gic supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <genid supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backup supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <async-teardown supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <ps2 supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sev supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sgx supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hyperv supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='features'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>relaxed</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vapic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>spinlocks</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vpindex</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>runtime</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>synic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>stimer</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reset</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vendor_id</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>frequencies</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reenlightenment</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tlbflush</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ipi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>avic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emsr_bitmap</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>xmm_input</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hyperv>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <launchSecurity supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.640 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  4 01:24:45 np0005470441 nova_compute[191640]: <domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <domain>kvm</domain>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <arch>x86_64</arch>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <vcpu max='4096'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <iothreads supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <os supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='firmware'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>efi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <loader supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>rom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pflash</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='readonly'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>yes</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='secure'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>yes</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>no</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </loader>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </os>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='maximumMigratable'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>on</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>off</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <vendor>AMD</vendor>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='succor'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <mode name='custom' supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Denverton-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='auto-ibrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amd-psfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='stibp-always-on'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='EPYC-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-128'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-256'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx10-512'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='prefetchiti'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Haswell-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512er'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512pf'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fma4'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tbm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xop'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='amx-tile'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-bf16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-fp16'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bitalg'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrc'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fzrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='la57'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='taa-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xfd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ifma'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cmpccxadd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fbsdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='fsrs'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ibrs-all'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mcdt-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pbrsb-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='psdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='serialize'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vaes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='hle'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='rtm'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512bw'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512cd'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512dq'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512f'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='avx512vl'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='invpcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pcid'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='pku'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='mpx'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='core-capability'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='split-lock-detect'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='cldemote'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='erms'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='gfni'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdir64b'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='movdiri'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='xsaves'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='athlon-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='core2duo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='coreduo-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='n270-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='ss'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <blockers model='phenom-v1'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnow'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <feature name='3dnowext'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </blockers>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </mode>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <memoryBacking supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <enum name='sourceType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>file</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>anonymous</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <value>memfd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </memoryBacking>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <disk supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='diskDevice'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>disk</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cdrom</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>floppy</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>lun</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>fdc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>sata</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </disk>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <graphics supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vnc</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egl-headless</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>dbus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </graphics>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <video supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='modelType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vga</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>cirrus</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>none</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>bochs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ramfb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </video>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hostdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='mode'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>subsystem</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='startupPolicy'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>mandatory</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>requisite</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>optional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='subsysType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pci</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>scsi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='capsType'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='pciBackend'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hostdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <rng supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtio-non-transitional</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>random</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>egd</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </rng>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <filesystem supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='driverType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>path</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>handle</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>virtiofs</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </filesystem>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <tpm supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-tis</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tpm-crb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emulator</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>external</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendVersion'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>2.0</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </tpm>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <redirdev supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='bus'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>usb</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </redirdev>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <channel supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>pty</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>unix</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </channel>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <crypto supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='type'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>qemu</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendModel'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>builtin</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </crypto>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <interface supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='backendType'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>default</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>passt</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </interface>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <panic supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='model'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>isa</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>hyperv</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </panic>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </devices>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <gic supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <genid supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <backup supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <async-teardown supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <ps2 supported='yes'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sev supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <sgx supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <hyperv supported='yes'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      <enum name='features'>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>relaxed</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vapic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>spinlocks</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vpindex</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>runtime</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>synic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>stimer</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reset</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>vendor_id</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>frequencies</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>reenlightenment</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>tlbflush</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>ipi</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>avic</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>emsr_bitmap</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:        <value>xmm_input</value>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:      </enum>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    </hyperv>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:    <launchSecurity supported='no'/>
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  </features>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </domainCapabilities>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.708 2 DEBUG nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.709 2 INFO nova.virt.libvirt.host [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Secure Boot support detected#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.710 2 INFO nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.711 2 INFO nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.721 2 DEBUG nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  4 01:24:45 np0005470441 nova_compute[191640]:  <model>Nehalem</model>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: </cpu>
Oct  4 01:24:45 np0005470441 nova_compute[191640]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.725 2 DEBUG nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.763 2 INFO nova.virt.node [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Determined node identity 4baba3a8-b392-49ca-9421-92d7b50a939b from /var/lib/nova/compute_id#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.782 2 WARNING nova.compute.manager [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Compute nodes ['4baba3a8-b392-49ca-9421-92d7b50a939b'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.849 2 INFO nova.compute.manager [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.906 2 WARNING nova.compute.manager [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.906 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.906 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.907 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:24:45 np0005470441 nova_compute[191640]: 2025-10-04 05:24:45.907 2 DEBUG nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:24:45 np0005470441 systemd[1]: Starting libvirt nodedev daemon...
Oct  4 01:24:45 np0005470441 systemd[1]: Started libvirt nodedev daemon.
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.224 2 WARNING nova.virt.libvirt.driver [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.226 2 DEBUG nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6211MB free_disk=73.66966247558594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.227 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.227 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.248 2 WARNING nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] No compute node record for compute-1.ctlplane.example.com:4baba3a8-b392-49ca-9421-92d7b50a939b: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 4baba3a8-b392-49ca-9421-92d7b50a939b could not be found.#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.271 2 INFO nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 4baba3a8-b392-49ca-9421-92d7b50a939b#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.377 2 DEBUG nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.378 2 DEBUG nova.compute.resource_tracker [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:24:46 np0005470441 python3.9[192536]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:24:46 np0005470441 systemd[1]: Stopping nova_compute container...
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.566 2 DEBUG oslo_concurrency.lockutils [None req-63e08308-f524-4cb5-a419-870badad9124 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.566 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.567 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:24:46 np0005470441 nova_compute[191640]: 2025-10-04 05:24:46.567 2 DEBUG oslo_concurrency.lockutils [None req-a7c70ae4-70ea-4a37-81c7-6060998ef2c1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:24:46 np0005470441 podman[192538]: 2025-10-04 05:24:46.631989412 +0000 UTC m=+0.121113388 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:24:47 np0005470441 virtqemud[192168]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  4 01:24:47 np0005470441 virtqemud[192168]: hostname: compute-1
Oct  4 01:24:47 np0005470441 virtqemud[192168]: End of file while reading data: Input/output error
Oct  4 01:24:47 np0005470441 systemd[1]: libpod-70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a.scope: Deactivated successfully.
Oct  4 01:24:47 np0005470441 systemd[1]: libpod-70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a.scope: Consumed 3.252s CPU time.
Oct  4 01:24:47 np0005470441 podman[192546]: 2025-10-04 05:24:47.018296606 +0000 UTC m=+0.489217844 container died 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Oct  4 01:24:47 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a-userdata-shm.mount: Deactivated successfully.
Oct  4 01:24:47 np0005470441 systemd[1]: var-lib-containers-storage-overlay-f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390-merged.mount: Deactivated successfully.
Oct  4 01:24:47 np0005470441 podman[192546]: 2025-10-04 05:24:47.142663235 +0000 UTC m=+0.613584473 container cleanup 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:24:47 np0005470441 podman[192546]: nova_compute
Oct  4 01:24:47 np0005470441 podman[192598]: nova_compute
Oct  4 01:24:47 np0005470441 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  4 01:24:47 np0005470441 systemd[1]: Stopped nova_compute container.
Oct  4 01:24:47 np0005470441 systemd[1]: Starting nova_compute container...
Oct  4 01:24:47 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:24:47 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:47 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:47 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:47 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:47 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21acf98b639d22e71105d63d4127660b3525c208514cdd49e93f24d10521390/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:47 np0005470441 podman[192611]: 2025-10-04 05:24:47.384445206 +0000 UTC m=+0.141714474 container init 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3)
Oct  4 01:24:47 np0005470441 podman[192611]: 2025-10-04 05:24:47.390528179 +0000 UTC m=+0.147797427 container start 70b9ba43b62298bfda245b1c846e3f152d7e19106e7fa29aaff849ca06d6316a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + sudo -E kolla_set_configs
Oct  4 01:24:47 np0005470441 podman[192611]: nova_compute
Oct  4 01:24:47 np0005470441 systemd[1]: Started nova_compute container.
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Validating config file
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying service configuration files
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /etc/ceph
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Creating directory /etc/ceph
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /etc/ceph
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Writing out command to execute
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:47 np0005470441 nova_compute[192626]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  4 01:24:47 np0005470441 nova_compute[192626]: ++ cat /run_command
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + CMD=nova-compute
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + ARGS=
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + sudo kolla_copy_cacerts
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + [[ ! -n '' ]]
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + . kolla_extend_start
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + echo 'Running command: '\''nova-compute'\'''
Oct  4 01:24:47 np0005470441 nova_compute[192626]: Running command: 'nova-compute'
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + umask 0022
Oct  4 01:24:47 np0005470441 nova_compute[192626]: + exec nova-compute
Oct  4 01:24:48 np0005470441 python3.9[192790]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  4 01:24:49 np0005470441 systemd[1]: Started libpod-conmon-f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd.scope.
Oct  4 01:24:49 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:24:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04a0aa24092a033094efba1e6ba2444929fad0230421262502972ecaf83bd11d/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04a0aa24092a033094efba1e6ba2444929fad0230421262502972ecaf83bd11d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/04a0aa24092a033094efba1e6ba2444929fad0230421262502972ecaf83bd11d/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  4 01:24:49 np0005470441 podman[192814]: 2025-10-04 05:24:49.226714374 +0000 UTC m=+0.230637454 container init f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  4 01:24:49 np0005470441 podman[192814]: 2025-10-04 05:24:49.234363332 +0000 UTC m=+0.238286422 container start f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:24:49 np0005470441 python3.9[192790]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Applying nova statedir ownership
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  4 01:24:49 np0005470441 nova_compute_init[192835]: INFO:nova_statedir:Nova statedir ownership complete
Oct  4 01:24:49 np0005470441 systemd[1]: libpod-f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd.scope: Deactivated successfully.
Oct  4 01:24:49 np0005470441 podman[192836]: 2025-10-04 05:24:49.317779436 +0000 UTC m=+0.022300836 container died f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:24:49 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd-userdata-shm.mount: Deactivated successfully.
Oct  4 01:24:49 np0005470441 systemd[1]: var-lib-containers-storage-overlay-04a0aa24092a033094efba1e6ba2444929fad0230421262502972ecaf83bd11d-merged.mount: Deactivated successfully.
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.546 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.546 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.547 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.547 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  4 01:24:49 np0005470441 podman[192847]: 2025-10-04 05:24:49.606612136 +0000 UTC m=+0.282894792 container cleanup f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:24:49 np0005470441 systemd[1]: libpod-conmon-f9b66eaada7517dbb714aabcb8013769fb8dfb07172293baa172919de4b4c6cd.scope: Deactivated successfully.
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.691 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:24:49 np0005470441 nova_compute[192626]: 2025-10-04 05:24:49.706 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.129 2 INFO nova.virt.driver [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.224 2 INFO nova.compute.provider_config [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.234 2 DEBUG oslo_concurrency.lockutils [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.234 2 DEBUG oslo_concurrency.lockutils [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.234 2 DEBUG oslo_concurrency.lockutils [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.234 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.235 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.236 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.237 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.238 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.239 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.240 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.241 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.242 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.243 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.244 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.245 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.246 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.247 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.248 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.249 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.250 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.251 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.252 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.253 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.254 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.255 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.256 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.257 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.258 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.259 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.260 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.261 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.262 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.263 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.264 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.265 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.266 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.267 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.268 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.269 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.270 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.271 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.272 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.273 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.274 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.275 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.276 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.277 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.278 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.279 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.280 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.281 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.282 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.283 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.284 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.285 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.286 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.287 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.288 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.289 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.290 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.291 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.292 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.293 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.294 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.295 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.296 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.297 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.298 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.299 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.300 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.301 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.302 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.303 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.304 2 WARNING oslo_config.cfg [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  4 01:24:50 np0005470441 nova_compute[192626]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  4 01:24:50 np0005470441 nova_compute[192626]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  4 01:24:50 np0005470441 nova_compute[192626]: and ``live_migration_inbound_addr`` respectively.
Oct  4 01:24:50 np0005470441 nova_compute[192626]: ).  Its value may be silently ignored in the future.#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.305 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.306 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.307 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.308 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.309 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.310 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.311 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.312 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.313 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.314 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.315 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.316 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.317 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.318 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.319 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.320 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.321 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.322 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.323 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.323 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.323 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.323 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.323 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.324 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.325 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.326 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.327 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.328 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.329 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.330 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.331 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.332 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.333 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.334 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.335 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.336 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.337 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.338 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.339 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.340 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.341 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.342 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.342 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.342 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.342 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.342 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.343 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.344 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.345 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.346 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.347 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.348 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.349 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.350 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.351 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.352 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.353 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.354 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.355 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.356 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.357 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.358 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.359 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.360 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.361 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.362 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.363 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.364 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.365 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.366 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.367 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.368 2 DEBUG oslo_service.service [None req-34df9ae3-039e-44ac-9513-5d287979b38a - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.369 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.387 2 INFO nova.virt.node [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Determined node identity 4baba3a8-b392-49ca-9421-92d7b50a939b from /var/lib/nova/compute_id#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.388 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.388 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.388 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.388 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.398 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f54237f43d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.401 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f54237f43d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.401 2 INFO nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.407 2 INFO nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt host capabilities <capabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <host>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <uuid>2c012175-5984-4641-9ddd-886dcd6e4c6f</uuid>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <arch>x86_64</arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model>EPYC-Rome-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <vendor>AMD</vendor>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <microcode version='16777317'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <signature family='23' model='49' stepping='0'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='x2apic'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='tsc-deadline'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='osxsave'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='hypervisor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='tsc_adjust'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='spec-ctrl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='stibp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='arch-capabilities'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='cmp_legacy'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='topoext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='virt-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='lbrv'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='tsc-scale'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='vmcb-clean'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='pause-filter'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='pfthreshold'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='svme-addr-chk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='rdctl-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='skip-l1dfl-vmentry'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='mds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature name='pschange-mc-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <pages unit='KiB' size='4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <pages unit='KiB' size='2048'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <pages unit='KiB' size='1048576'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <power_management>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <suspend_mem/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <suspend_disk/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <suspend_hybrid/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </power_management>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <iommu support='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <migration_features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <live/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <uri_transports>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <uri_transport>tcp</uri_transport>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <uri_transport>rdma</uri_transport>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </uri_transports>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </migration_features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <topology>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <cells num='1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <cell id='0'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <memory unit='KiB'>7864096</memory>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <pages unit='KiB' size='4'>1966024</pages>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <pages unit='KiB' size='2048'>0</pages>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <distances>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <sibling id='0' value='10'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          </distances>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          <cpus num='8'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:          </cpus>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        </cell>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </cells>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </topology>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <cache>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </cache>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <secmodel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model>selinux</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <doi>0</doi>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </secmodel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <secmodel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model>dac</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <doi>0</doi>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </secmodel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </host>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <guest>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <os_type>hvm</os_type>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <arch name='i686'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <wordsize>32</wordsize>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <domain type='qemu'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <domain type='kvm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <pae/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <nonpae/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <acpi default='on' toggle='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <apic default='on' toggle='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <cpuselection/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <deviceboot/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <disksnapshot default='on' toggle='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <externalSnapshot/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </guest>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <guest>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <os_type>hvm</os_type>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <arch name='x86_64'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <wordsize>64</wordsize>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <domain type='qemu'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <domain type='kvm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <acpi default='on' toggle='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <apic default='on' toggle='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <cpuselection/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <deviceboot/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <disksnapshot default='on' toggle='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <externalSnapshot/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </guest>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </capabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: #033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.417 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.419 2 DEBUG nova.virt.libvirt.volume.mount [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.425 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  4 01:24:50 np0005470441 nova_compute[192626]: <domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <domain>kvm</domain>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <arch>i686</arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <vcpu max='240'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <iothreads supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <os supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='firmware'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <loader supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>rom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pflash</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='readonly'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>yes</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='secure'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </loader>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='maximumMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <vendor>AMD</vendor>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='succor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='custom' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-128'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-256'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-512'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <memoryBacking supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='sourceType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>file</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>anonymous</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>memfd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </memoryBacking>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <disk supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='diskDevice'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>disk</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cdrom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>floppy</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>lun</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ide</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>fdc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>sata</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <graphics supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vnc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egl-headless</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>dbus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <video supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='modelType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vga</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cirrus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>none</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>bochs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ramfb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hostdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='mode'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>subsystem</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='startupPolicy'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>mandatory</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>requisite</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>optional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='subsysType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pci</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='capsType'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='pciBackend'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hostdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <rng supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>random</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <filesystem supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='driverType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>path</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>handle</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtiofs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </filesystem>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <tpm supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-tis</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-crb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emulator</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>external</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendVersion'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>2.0</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </tpm>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <redirdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </redirdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <channel supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pty</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>unix</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </channel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <crypto supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>qemu</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </crypto>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <interface supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>passt</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <panic supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>isa</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>hyperv</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </panic>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <gic supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <genid supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backup supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <async-teardown supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <ps2 supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sev supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sgx supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hyperv supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='features'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>relaxed</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vapic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>spinlocks</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vpindex</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>runtime</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>synic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>stimer</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reset</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vendor_id</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>frequencies</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reenlightenment</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tlbflush</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ipi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>avic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emsr_bitmap</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>xmm_input</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hyperv>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <launchSecurity supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.432 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  4 01:24:50 np0005470441 nova_compute[192626]: <domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <domain>kvm</domain>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <arch>i686</arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <vcpu max='4096'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <iothreads supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <os supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='firmware'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <loader supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>rom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pflash</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='readonly'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>yes</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='secure'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </loader>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='maximumMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <vendor>AMD</vendor>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='succor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='custom' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-128'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-256'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-512'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <memoryBacking supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='sourceType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>file</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>anonymous</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>memfd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </memoryBacking>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <disk supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='diskDevice'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>disk</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cdrom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>floppy</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>lun</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>fdc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>sata</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <graphics supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vnc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egl-headless</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>dbus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <video supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='modelType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vga</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cirrus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>none</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>bochs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ramfb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hostdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='mode'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>subsystem</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='startupPolicy'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>mandatory</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>requisite</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>optional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='subsysType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pci</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='capsType'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='pciBackend'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hostdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <rng supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>random</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <filesystem supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='driverType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>path</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>handle</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtiofs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </filesystem>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <tpm supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-tis</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-crb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emulator</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>external</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendVersion'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>2.0</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </tpm>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <redirdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </redirdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <channel supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pty</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>unix</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </channel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <crypto supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>qemu</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </crypto>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <interface supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>passt</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <panic supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>isa</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>hyperv</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </panic>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <gic supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <genid supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backup supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <async-teardown supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <ps2 supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sev supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sgx supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hyperv supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='features'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>relaxed</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vapic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>spinlocks</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vpindex</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>runtime</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>synic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>stimer</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reset</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vendor_id</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>frequencies</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reenlightenment</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tlbflush</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ipi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>avic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emsr_bitmap</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>xmm_input</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hyperv>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <launchSecurity supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.458 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.461 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  4 01:24:50 np0005470441 nova_compute[192626]: <domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <domain>kvm</domain>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <arch>x86_64</arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <vcpu max='4096'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <iothreads supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <os supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='firmware'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>efi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <loader supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>rom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pflash</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='readonly'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>yes</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='secure'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>yes</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </loader>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='maximumMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <vendor>AMD</vendor>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='succor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='custom' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-128'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-256'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-512'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <memoryBacking supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='sourceType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>file</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>anonymous</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>memfd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </memoryBacking>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <disk supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='diskDevice'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>disk</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cdrom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>floppy</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>lun</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>fdc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>sata</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <graphics supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vnc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egl-headless</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>dbus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <video supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='modelType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vga</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cirrus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>none</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>bochs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ramfb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hostdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='mode'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>subsystem</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='startupPolicy'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>mandatory</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>requisite</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>optional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='subsysType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pci</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='capsType'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='pciBackend'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hostdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <rng supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>random</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <filesystem supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='driverType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>path</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>handle</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtiofs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </filesystem>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <tpm supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-tis</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-crb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emulator</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>external</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendVersion'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>2.0</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </tpm>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <redirdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </redirdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <channel supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pty</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>unix</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </channel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <crypto supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>qemu</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </crypto>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <interface supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>passt</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <panic supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>isa</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>hyperv</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </panic>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <gic supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <genid supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backup supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <async-teardown supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <ps2 supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sev supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sgx supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hyperv supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='features'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>relaxed</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vapic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>spinlocks</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vpindex</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>runtime</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>synic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>stimer</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reset</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vendor_id</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>frequencies</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reenlightenment</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tlbflush</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ipi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>avic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emsr_bitmap</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>xmm_input</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hyperv>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <launchSecurity supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.530 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  4 01:24:50 np0005470441 nova_compute[192626]: <domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <path>/usr/libexec/qemu-kvm</path>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <domain>kvm</domain>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <arch>x86_64</arch>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <vcpu max='240'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <iothreads supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <os supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='firmware'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <loader supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>rom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pflash</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='readonly'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>yes</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='secure'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>no</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </loader>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-passthrough' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='hostPassthroughMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='maximum' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='maximumMigratable'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>on</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>off</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='host-model' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <vendor>AMD</vendor>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='x2apic'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-deadline'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='hypervisor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc_adjust'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='spec-ctrl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='stibp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='arch-capabilities'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='cmp_legacy'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='overflow-recov'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='succor'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='amd-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='virt-ssbd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lbrv'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='tsc-scale'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='vmcb-clean'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='flushbyasid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pause-filter'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pfthreshold'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='svme-addr-chk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rdctl-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='mds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='pschange-mc-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='gds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='require' name='rfds-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <feature policy='disable' name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <mode name='custom' supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Broadwell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cascadelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Cooperlake-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Denverton-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Dhyana-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Genoa-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='auto-ibrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Milan-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amd-psfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='no-nested-data-bp'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='null-sel-clr-base'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='stibp-always-on'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-Rome-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='EPYC-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='GraniteRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-128'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-256'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx10-512'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='prefetchiti'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Haswell-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-noTSX'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v6'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Icelake-Server-v7'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='IvyBridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='KnightsMill-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4fmaps'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-4vnniw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512er'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512pf'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G4-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Opteron_G5-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fma4'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tbm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xop'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SapphireRapids-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='amx-tile'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-bf16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-fp16'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512-vpopcntdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bitalg'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vbmi2'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrc'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fzrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='la57'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='taa-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='tsx-ldtrk'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xfd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='SierraForest-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ifma'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-ne-convert'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx-vnni-int8'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='bus-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cmpccxadd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fbsdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='fsrs'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ibrs-all'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mcdt-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pbrsb-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='psdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='sbdr-ssdp-no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='serialize'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vaes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='vpclmulqdq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Client-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='hle'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='rtm'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Skylake-Server-v5'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512bw'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512cd'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512dq'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512f'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='avx512vl'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='invpcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pcid'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='pku'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='mpx'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v2'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v3'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='core-capability'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='split-lock-detect'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='Snowridge-v4'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='cldemote'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='erms'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='gfni'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdir64b'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='movdiri'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='xsaves'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='athlon-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='core2duo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='coreduo-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='n270-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='ss'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <blockers model='phenom-v1'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnow'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <feature name='3dnowext'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </blockers>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </mode>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <memoryBacking supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <enum name='sourceType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>file</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>anonymous</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <value>memfd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </memoryBacking>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <disk supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='diskDevice'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>disk</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cdrom</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>floppy</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>lun</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ide</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>fdc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>sata</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <graphics supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vnc</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egl-headless</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>dbus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <video supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='modelType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vga</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>cirrus</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>none</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>bochs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ramfb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hostdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='mode'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>subsystem</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='startupPolicy'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>mandatory</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>requisite</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>optional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='subsysType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pci</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>scsi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='capsType'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='pciBackend'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hostdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <rng supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtio-non-transitional</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>random</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>egd</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <filesystem supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='driverType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>path</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>handle</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>virtiofs</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </filesystem>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <tpm supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-tis</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tpm-crb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emulator</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>external</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendVersion'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>2.0</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </tpm>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <redirdev supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='bus'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>usb</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </redirdev>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <channel supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>pty</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>unix</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </channel>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <crypto supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='type'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>qemu</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendModel'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>builtin</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </crypto>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <interface supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='backendType'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>default</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>passt</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <panic supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='model'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>isa</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>hyperv</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </panic>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <gic supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <vmcoreinfo supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <genid supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backingStoreInput supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <backup supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <async-teardown supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <ps2 supported='yes'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sev supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <sgx supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <hyperv supported='yes'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      <enum name='features'>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>relaxed</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vapic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>spinlocks</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vpindex</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>runtime</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>synic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>stimer</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reset</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>vendor_id</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>frequencies</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>reenlightenment</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>tlbflush</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>ipi</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>avic</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>emsr_bitmap</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:        <value>xmm_input</value>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:      </enum>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    </hyperv>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:    <launchSecurity supported='no'/>
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </domainCapabilities>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.598 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.598 2 INFO nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Secure Boot support detected#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.600 2 INFO nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.601 2 INFO nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.610 2 DEBUG nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] cpu compare xml: <cpu match="exact">
Oct  4 01:24:50 np0005470441 nova_compute[192626]:  <model>Nehalem</model>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: </cpu>
Oct  4 01:24:50 np0005470441 nova_compute[192626]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.612 2 DEBUG nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.639 2 INFO nova.virt.node [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Determined node identity 4baba3a8-b392-49ca-9421-92d7b50a939b from /var/lib/nova/compute_id#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.662 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Verified node 4baba3a8-b392-49ca-9421-92d7b50a939b matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.685 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.739 2 ERROR nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Could not retrieve compute node resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b and therefore unable to error out any instances stuck in BUILDING state. Error: Failed to retrieve allocations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '4baba3a8-b392-49ca-9421-92d7b50a939b' not found: No resource provider with uuid 4baba3a8-b392-49ca-9421-92d7b50a939b found  ", "request_id": "req-7420c94d-83c8-4fd8-aa19-c87d6324a0c9"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '4baba3a8-b392-49ca-9421-92d7b50a939b' not found: No resource provider with uuid 4baba3a8-b392-49ca-9421-92d7b50a939b found  ", "request_id": "req-7420c94d-83c8-4fd8-aa19-c87d6324a0c9"}]}#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.759 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.759 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.759 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.760 2 DEBUG nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:24:50 np0005470441 systemd[1]: session-26.scope: Deactivated successfully.
Oct  4 01:24:50 np0005470441 systemd[1]: session-26.scope: Consumed 2min 14.619s CPU time.
Oct  4 01:24:50 np0005470441 systemd-logind[796]: Session 26 logged out. Waiting for processes to exit.
Oct  4 01:24:50 np0005470441 systemd-logind[796]: Removed session 26.
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.932 2 WARNING nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.933 2 DEBUG nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6164MB free_disk=73.66824340820312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.933 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:24:50 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.934 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.999 2 ERROR nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Skipping removal of allocations for deleted instances: Failed to retrieve allocations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '4baba3a8-b392-49ca-9421-92d7b50a939b' not found: No resource provider with uuid 4baba3a8-b392-49ca-9421-92d7b50a939b found  ", "request_id": "req-0e5beb96-ddc4-48ab-a516-2019fd1b7e0f"}]}: nova.exception.ResourceProviderAllocationRetrievalFailed: Failed to retrieve allocations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b: {"errors": [{"status": 404, "title": "Not Found", "detail": "The resource could not be found.\n\n Resource provider '4baba3a8-b392-49ca-9421-92d7b50a939b' not found: No resource provider with uuid 4baba3a8-b392-49ca-9421-92d7b50a939b found  ", "request_id": "req-0e5beb96-ddc4-48ab-a516-2019fd1b7e0f"}]}#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:50.999 2 DEBUG nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.000 2 DEBUG nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.108 2 INFO nova.scheduler.client.report [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [req-19adb8ce-5282-45a3-9e07-840e00941e02] Created resource provider record via placement API for resource provider with UUID 4baba3a8-b392-49ca-9421-92d7b50a939b and name compute-1.ctlplane.example.com.#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.145 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  4 01:24:51 np0005470441 nova_compute[192626]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.146 2 INFO nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.147 2 DEBUG nova.compute.provider_tree [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.147 2 DEBUG nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.150 2 DEBUG nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Libvirt baseline CPU <cpu>
Oct  4 01:24:51 np0005470441 nova_compute[192626]:  <arch>x86_64</arch>
Oct  4 01:24:51 np0005470441 nova_compute[192626]:  <model>Nehalem</model>
Oct  4 01:24:51 np0005470441 nova_compute[192626]:  <vendor>AMD</vendor>
Oct  4 01:24:51 np0005470441 nova_compute[192626]:  <topology sockets="8" cores="1" threads="1"/>
Oct  4 01:24:51 np0005470441 nova_compute[192626]: </cpu>
Oct  4 01:24:51 np0005470441 nova_compute[192626]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.269 2 DEBUG nova.scheduler.client.report [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Updated inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.270 2 DEBUG nova.compute.provider_tree [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Updating resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.270 2 DEBUG nova.compute.provider_tree [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.344 2 DEBUG nova.compute.provider_tree [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Updating resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.371 2 DEBUG nova.compute.resource_tracker [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.371 2 DEBUG oslo_concurrency.lockutils [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.372 2 DEBUG nova.service [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.456 2 DEBUG nova.service [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  4 01:24:51 np0005470441 nova_compute[192626]: 2025-10-04 05:24:51.457 2 DEBUG nova.servicegroup.drivers.db [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  4 01:24:55 np0005470441 podman[192924]: 2025-10-04 05:24:55.313283531 +0000 UTC m=+0.065469644 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  4 01:24:56 np0005470441 systemd-logind[796]: New session 29 of user zuul.
Oct  4 01:24:56 np0005470441 systemd[1]: Started Session 29 of User zuul.
Oct  4 01:24:57 np0005470441 python3.9[193097]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  4 01:24:59 np0005470441 python3.9[193253]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:24:59 np0005470441 systemd[1]: Reloading.
Oct  4 01:24:59 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:24:59 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:25:00 np0005470441 podman[193411]: 2025-10-04 05:25:00.046213636 +0000 UTC m=+0.062039277 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  4 01:25:00 np0005470441 python3.9[193450]: ansible-ansible.builtin.service_facts Invoked
Oct  4 01:25:00 np0005470441 network[193475]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  4 01:25:00 np0005470441 network[193476]: 'network-scripts' will be removed from distribution in near future.
Oct  4 01:25:00 np0005470441 network[193477]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  4 01:25:04 np0005470441 python3.9[193754]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:25:05 np0005470441 python3.9[193907]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:05 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:25:06 np0005470441 python3.9[194060]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:25:06.729 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:25:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:25:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:25:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:25:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:25:07 np0005470441 python3.9[194212]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:25:08 np0005470441 python3.9[194364]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:25:09 np0005470441 python3.9[194516]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:25:09 np0005470441 systemd[1]: Reloading.
Oct  4 01:25:09 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:25:09 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:25:10 np0005470441 python3.9[194703]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:25:11 np0005470441 python3.9[194856]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:25:12 np0005470441 python3.9[195006]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:25:13 np0005470441 python3.9[195158]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:14 np0005470441 python3.9[195279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555512.8612258-360-275964099483528/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:25:14 np0005470441 nova_compute[192626]: 2025-10-04 05:25:14.460 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:14 np0005470441 nova_compute[192626]: 2025-10-04 05:25:14.542 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:14 np0005470441 podman[195403]: 2025-10-04 05:25:14.845559119 +0000 UTC m=+0.061551632 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:25:15 np0005470441 python3.9[195450]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  4 01:25:16 np0005470441 python3.9[195603]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  4 01:25:17 np0005470441 podman[195728]: 2025-10-04 05:25:17.11725472 +0000 UTC m=+0.126758149 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  4 01:25:17 np0005470441 python3.9[195771]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  4 01:25:19 np0005470441 python3.9[195940]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  4 01:25:23 np0005470441 python3.9[196098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:23 np0005470441 python3.9[196219]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759555522.7605555-564-189275480335644/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:24 np0005470441 python3.9[196369]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:24 np0005470441 python3.9[196490]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759555523.9031105-564-2045604262282/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:25 np0005470441 podman[196614]: 2025-10-04 05:25:25.446361706 +0000 UTC m=+0.069764166 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:25:25 np0005470441 python3.9[196652]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:25 np0005470441 auditd[699]: Audit daemon rotating log files
Oct  4 01:25:26 np0005470441 python3.9[196781]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759555525.13995-564-58529506254695/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:26 np0005470441 python3.9[196931]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:25:27 np0005470441 python3.9[197083]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:25:28 np0005470441 python3.9[197235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:29 np0005470441 python3.9[197356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555527.9693494-741-119686431156870/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:29 np0005470441 python3.9[197506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:30 np0005470441 python3.9[197582]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:30 np0005470441 podman[197583]: 2025-10-04 05:25:30.301941271 +0000 UTC m=+0.062290354 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:25:30 np0005470441 python3.9[197752]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:31 np0005470441 python3.9[197873]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555530.4024627-741-234229523907331/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:32 np0005470441 python3.9[198023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:32 np0005470441 python3.9[198144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555531.622154-741-101065217459709/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:33 np0005470441 python3.9[198294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:33 np0005470441 python3.9[198415]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555532.7403357-741-255994746014899/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:34 np0005470441 python3.9[198565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:35 np0005470441 python3.9[198686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555533.94933-741-162271553888657/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:35 np0005470441 python3.9[198836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:36 np0005470441 python3.9[198957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555535.2297952-741-49256521086646/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:36 np0005470441 python3.9[199107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:37 np0005470441 python3.9[199228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555536.5010483-741-132983279258675/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:38 np0005470441 python3.9[199378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:38 np0005470441 python3.9[199499]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555537.922696-741-248425158007042/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:39 np0005470441 python3.9[199649]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:40 np0005470441 python3.9[199770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555539.1363163-741-68045540672700/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:40 np0005470441 python3.9[199920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:41 np0005470441 python3.9[200041]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555540.3565075-741-216427032725278/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:42 np0005470441 python3.9[200191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:42 np0005470441 python3.9[200267]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:43 np0005470441 python3.9[200417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:43 np0005470441 python3.9[200493]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:44 np0005470441 python3.9[200643]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:45 np0005470441 python3.9[200719]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:45 np0005470441 podman[200720]: 2025-10-04 05:25:45.118385941 +0000 UTC m=+0.057094417 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  4 01:25:45 np0005470441 python3.9[200891]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:46 np0005470441 python3.9[201043]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:47 np0005470441 python3.9[201195]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:25:47 np0005470441 podman[201196]: 2025-10-04 05:25:47.354449825 +0000 UTC m=+0.105722200 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:25:48 np0005470441 python3.9[201373]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:25:48 np0005470441 systemd[1]: Reloading.
Oct  4 01:25:48 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:25:48 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:25:48 np0005470441 systemd[1]: Listening on Podman API Socket.
Oct  4 01:25:49 np0005470441 python3.9[201564]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.719 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.720 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.720 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.750 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.751 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.751 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.751 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.752 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.752 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.752 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.752 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.753 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.885 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.885 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.885 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:25:49 np0005470441 nova_compute[192626]: 2025-10-04 05:25:49.885 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.036 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.037 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6159MB free_disk=73.66873931884766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.038 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.038 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.110 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.110 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.136 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.170 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.171 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:25:50 np0005470441 nova_compute[192626]: 2025-10-04 05:25:50.171 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:25:50 np0005470441 python3.9[201687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555549.125914-1407-69596059398629/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:25:50 np0005470441 python3.9[201763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:25:51 np0005470441 python3.9[201886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555549.125914-1407-69596059398629/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:25:52 np0005470441 python3.9[202038]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  4 01:25:53 np0005470441 python3.9[202190]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:25:54 np0005470441 python3[202342]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:25:54 np0005470441 podman[202379]: 2025-10-04 05:25:54.860917859 +0000 UTC m=+0.078822841 container create a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:25:54 np0005470441 podman[202379]: 2025-10-04 05:25:54.805908865 +0000 UTC m=+0.023813867 image pull 50efaea89142e519f31f4c7eaa86ed42b916b0efad6daebac6b52c254ced116c quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  4 01:25:54 np0005470441 python3[202342]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Oct  4 01:25:55 np0005470441 podman[202568]: 2025-10-04 05:25:55.576087173 +0000 UTC m=+0.061467199 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct  4 01:25:55 np0005470441 python3.9[202569]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:25:56 np0005470441 python3.9[202743]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:57 np0005470441 python3.9[202894]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555556.69569-1599-218866126552793/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:25:58 np0005470441 python3.9[202970]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:25:58 np0005470441 systemd[1]: Reloading.
Oct  4 01:25:58 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:25:58 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:25:59 np0005470441 python3.9[203081]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:25:59 np0005470441 systemd[1]: Reloading.
Oct  4 01:25:59 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:25:59 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:25:59 np0005470441 systemd[1]: Starting ceilometer_agent_compute container...
Oct  4 01:25:59 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:25:59 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:25:59 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:25:59 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  4 01:25:59 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  4 01:25:59 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.
Oct  4 01:25:59 np0005470441 podman[203121]: 2025-10-04 05:25:59.63150297 +0000 UTC m=+0.148217628 container init a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + sudo -E kolla_set_configs
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: sudo: unable to send audit message: Operation not permitted
Oct  4 01:25:59 np0005470441 podman[203121]: 2025-10-04 05:25:59.661247854 +0000 UTC m=+0.177962512 container start a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Oct  4 01:25:59 np0005470441 podman[203121]: ceilometer_agent_compute
Oct  4 01:25:59 np0005470441 systemd[1]: Started ceilometer_agent_compute container.
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Validating config file
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Copying service configuration files
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: INFO:__main__:Writing out command to execute
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: ++ cat /run_command
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + ARGS=
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + sudo kolla_copy_cacerts
Oct  4 01:25:59 np0005470441 podman[203143]: 2025-10-04 05:25:59.734273 +0000 UTC m=+0.061505200 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:25:59 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-4b5770507ffc2205.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:25:59 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-4b5770507ffc2205.service: Failed with result 'exit-code'.
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: sudo: unable to send audit message: Operation not permitted
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + [[ ! -n '' ]]
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + . kolla_extend_start
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + umask 0022
Oct  4 01:25:59 np0005470441 ceilometer_agent_compute[203135]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  4 01:26:00 np0005470441 python3.9[203320]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:26:00 np0005470441 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.676 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.677 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.677 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.677 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.677 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.678 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.678 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.678 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.678 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.678 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.679 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.680 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.681 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.682 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.683 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.684 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.685 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.686 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.687 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.688 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.689 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.690 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.691 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 podman[203322]: 2025-10-04 05:26:00.692672797 +0000 UTC m=+0.060609324 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.692 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.693 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.694 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.695 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.696 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.697 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.697 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.697 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.697 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.697 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.698 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.715 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.716 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.717 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.813 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.813 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.825 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.914 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.915 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.916 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.917 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.918 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.919 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.920 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.921 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.922 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.923 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.924 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.925 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.926 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.927 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.928 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.929 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.930 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.931 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.932 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.933 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.934 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.935 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.936 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.937 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.938 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct  4 01:26:00 np0005470441 ceilometer_agent_compute[203135]: 2025-10-04 05:26:00.949 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct  4 01:26:00 np0005470441 virtqemud[192168]: End of file while reading data: Input/output error
Oct  4 01:26:01 np0005470441 systemd[1]: libpod-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope: Deactivated successfully.
Oct  4 01:26:01 np0005470441 systemd[1]: libpod-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope: Consumed 1.529s CPU time.
Oct  4 01:26:01 np0005470441 podman[203330]: 2025-10-04 05:26:01.186684071 +0000 UTC m=+0.527799982 container died a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_id=edpm, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:26:01 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-4b5770507ffc2205.timer: Deactivated successfully.
Oct  4 01:26:01 np0005470441 systemd[1]: Stopped /usr/bin/podman healthcheck run a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.
Oct  4 01:26:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-userdata-shm.mount: Deactivated successfully.
Oct  4 01:26:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay-429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471-merged.mount: Deactivated successfully.
Oct  4 01:26:01 np0005470441 podman[203330]: 2025-10-04 05:26:01.242054926 +0000 UTC m=+0.583170786 container cleanup a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  4 01:26:01 np0005470441 podman[203330]: ceilometer_agent_compute
Oct  4 01:26:01 np0005470441 podman[203374]: ceilometer_agent_compute
Oct  4 01:26:01 np0005470441 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  4 01:26:01 np0005470441 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  4 01:26:01 np0005470441 systemd[1]: Starting ceilometer_agent_compute container...
Oct  4 01:26:01 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:01 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:01 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:01 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:01 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/429f35e62924d127f3c06f853e017a64a27ed8107188a41d7043bed4eb504471/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:01 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.
Oct  4 01:26:01 np0005470441 podman[203387]: 2025-10-04 05:26:01.455889556 +0000 UTC m=+0.133008990 container init a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + sudo -E kolla_set_configs
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: sudo: unable to send audit message: Operation not permitted
Oct  4 01:26:01 np0005470441 podman[203387]: 2025-10-04 05:26:01.489141206 +0000 UTC m=+0.166260620 container start a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  4 01:26:01 np0005470441 podman[203387]: ceilometer_agent_compute
Oct  4 01:26:01 np0005470441 systemd[1]: Started ceilometer_agent_compute container.
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Validating config file
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Copying service configuration files
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: INFO:__main__:Writing out command to execute
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: ++ cat /run_command
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + ARGS=
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + sudo kolla_copy_cacerts
Oct  4 01:26:01 np0005470441 podman[203409]: 2025-10-04 05:26:01.550759119 +0000 UTC m=+0.054104858 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: sudo: unable to send audit message: Operation not permitted
Oct  4 01:26:01 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:26:01 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Failed with result 'exit-code'.
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + [[ ! -n '' ]]
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + . kolla_extend_start
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + umask 0022
Oct  4 01:26:01 np0005470441 ceilometer_agent_compute[203402]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  4 01:26:02 np0005470441 python3.9[203585]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.483 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.483 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.483 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.483 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.484 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.485 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.486 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.487 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.488 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.489 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.490 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.491 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.492 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.493 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.494 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.495 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.496 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.497 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.498 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.499 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.500 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.500 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.500 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.517 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.518 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.519 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.532 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.674 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.675 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.676 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.677 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.678 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.679 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.680 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.681 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.682 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.683 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.684 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.685 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.686 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.687 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.688 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.689 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.690 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.691 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.692 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.693 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.694 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.695 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.698 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.702 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:26:02.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:26:02 np0005470441 python3.9[203711]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555561.7099142-1695-175708875979738/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:26:03 np0005470441 python3.9[203866]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  4 01:26:04 np0005470441 python3.9[204018]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:26:05 np0005470441 python3[204170]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:26:05 np0005470441 podman[204207]: 2025-10-04 05:26:05.680295535 +0000 UTC m=+0.021544109 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  4 01:26:05 np0005470441 podman[204207]: 2025-10-04 05:26:05.835543963 +0000 UTC m=+0.176792517 container create 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Oct  4 01:26:05 np0005470441 python3[204170]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  4 01:26:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:26:06.730 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:26:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:26:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:26:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:26:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:26:06 np0005470441 python3.9[204397]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:26:07 np0005470441 python3.9[204551]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:08 np0005470441 python3.9[204702]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555567.6917107-1854-124951560153081/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:08 np0005470441 python3.9[204778]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:26:08 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:08 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:08 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:09 np0005470441 python3.9[204889]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:26:09 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:09 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:09 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:10 np0005470441 systemd[1]: Starting node_exporter container...
Oct  4 01:26:10 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:10 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/762a947d572f18b1a4e14bbccc642a245b415567de462fa8842c2e3800a66c4a/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:10 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/762a947d572f18b1a4e14bbccc642a245b415567de462fa8842c2e3800a66c4a/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:10 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.
Oct  4 01:26:10 np0005470441 podman[204930]: 2025-10-04 05:26:10.33360517 +0000 UTC m=+0.124064832 container init 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.349Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.349Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.349Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.349Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.349Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=arp
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=bcache
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=bonding
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=cpu
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=edac
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=filefd
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=netclass
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=netdev
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=netstat
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=nfs
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=nvme
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=softnet
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=systemd
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=xfs
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.350Z caller=node_exporter.go:117 level=info collector=zfs
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.351Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  4 01:26:10 np0005470441 node_exporter[204944]: ts=2025-10-04T05:26:10.352Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  4 01:26:10 np0005470441 podman[204930]: 2025-10-04 05:26:10.363968722 +0000 UTC m=+0.154428364 container start 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:26:10 np0005470441 podman[204930]: node_exporter
Oct  4 01:26:10 np0005470441 systemd[1]: Started node_exporter container.
Oct  4 01:26:10 np0005470441 podman[204953]: 2025-10-04 05:26:10.437249096 +0000 UTC m=+0.061326195 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:26:11 np0005470441 python3.9[205128]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:26:11 np0005470441 systemd[1]: Stopping node_exporter container...
Oct  4 01:26:11 np0005470441 systemd[1]: libpod-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope: Deactivated successfully.
Oct  4 01:26:11 np0005470441 conmon[204944]: conmon 69802beeae07c5662520 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope/container/memory.events
Oct  4 01:26:11 np0005470441 podman[205132]: 2025-10-04 05:26:11.550313013 +0000 UTC m=+0.055569981 container died 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:26:11 np0005470441 systemd[1]: 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e-7cd549b239ae80a3.timer: Deactivated successfully.
Oct  4 01:26:11 np0005470441 systemd[1]: Stopped /usr/bin/podman healthcheck run 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.
Oct  4 01:26:11 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e-userdata-shm.mount: Deactivated successfully.
Oct  4 01:26:11 np0005470441 systemd[1]: var-lib-containers-storage-overlay-762a947d572f18b1a4e14bbccc642a245b415567de462fa8842c2e3800a66c4a-merged.mount: Deactivated successfully.
Oct  4 01:26:11 np0005470441 podman[205132]: 2025-10-04 05:26:11.752336488 +0000 UTC m=+0.257593456 container cleanup 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:26:11 np0005470441 podman[205132]: node_exporter
Oct  4 01:26:11 np0005470441 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  4 01:26:11 np0005470441 podman[205161]: node_exporter
Oct  4 01:26:11 np0005470441 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  4 01:26:11 np0005470441 systemd[1]: Stopped node_exporter container.
Oct  4 01:26:11 np0005470441 systemd[1]: Starting node_exporter container...
Oct  4 01:26:11 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:11 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/762a947d572f18b1a4e14bbccc642a245b415567de462fa8842c2e3800a66c4a/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:11 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/762a947d572f18b1a4e14bbccc642a245b415567de462fa8842c2e3800a66c4a/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:11 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.
Oct  4 01:26:11 np0005470441 podman[205173]: 2025-10-04 05:26:11.953139626 +0000 UTC m=+0.105677829 container init 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.967Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.967Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.967Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=arp
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=bcache
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=bonding
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.968Z caller=node_exporter.go:117 level=info collector=cpu
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=edac
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=filefd
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=netclass
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=netdev
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=netstat
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=nfs
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=nvme
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=softnet
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=systemd
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=xfs
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=node_exporter.go:117 level=info collector=zfs
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  4 01:26:11 np0005470441 node_exporter[205188]: ts=2025-10-04T05:26:11.969Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  4 01:26:11 np0005470441 podman[205173]: 2025-10-04 05:26:11.982036085 +0000 UTC m=+0.134574258 container start 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:26:11 np0005470441 podman[205173]: node_exporter
Oct  4 01:26:11 np0005470441 systemd[1]: Started node_exporter container.
Oct  4 01:26:12 np0005470441 podman[205198]: 2025-10-04 05:26:12.046656818 +0000 UTC m=+0.054037376 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:26:12 np0005470441 python3.9[205373]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:26:13 np0005470441 python3.9[205496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555572.337899-1950-239274247097294/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:26:14 np0005470441 python3.9[205648]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  4 01:26:15 np0005470441 podman[205800]: 2025-10-04 05:26:15.234706565 +0000 UTC m=+0.079417379 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:26:15 np0005470441 python3.9[205801]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:26:16 np0005470441 python3[205973]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:26:17 np0005470441 podman[206044]: 2025-10-04 05:26:17.898613992 +0000 UTC m=+0.301351491 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:26:18 np0005470441 podman[205986]: 2025-10-04 05:26:18.519245184 +0000 UTC m=+2.014449251 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  4 01:26:18 np0005470441 podman[206108]: 2025-10-04 05:26:18.680802511 +0000 UTC m=+0.065538071 container create 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Oct  4 01:26:18 np0005470441 podman[206108]: 2025-10-04 05:26:18.635753347 +0000 UTC m=+0.020488937 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  4 01:26:18 np0005470441 python3[205973]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  4 01:26:19 np0005470441 python3.9[206298]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:26:20 np0005470441 python3.9[206452]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:21 np0005470441 python3.9[206603]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555580.8822818-2109-188646901502370/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:22 np0005470441 python3.9[206679]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:26:22 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:22 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:22 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:23 np0005470441 python3.9[206791]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:26:23 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:23 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:23 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:23 np0005470441 systemd[1]: Starting podman_exporter container...
Oct  4 01:26:23 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:23 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a62ac1e41f1fe34d85472c693912d3f39c15bcc39769839b3d8eff2eb3bbcf31/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:23 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a62ac1e41f1fe34d85472c693912d3f39c15bcc39769839b3d8eff2eb3bbcf31/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:23 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.
Oct  4 01:26:23 np0005470441 podman[206831]: 2025-10-04 05:26:23.901041813 +0000 UTC m=+0.476215320 container init 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:26:23 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:23.916Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  4 01:26:23 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:23.916Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  4 01:26:23 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:23.916Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  4 01:26:23 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:23.916Z caller=handler.go:105 level=info collector=container
Oct  4 01:26:23 np0005470441 podman[206831]: 2025-10-04 05:26:23.921012634 +0000 UTC m=+0.496186121 container start 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:26:23 np0005470441 systemd[1]: Starting Podman API Service...
Oct  4 01:26:23 np0005470441 systemd[1]: Started Podman API Service.
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="Setting parallel job count to 25"
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="Using sqlite as database backend"
Oct  4 01:26:23 np0005470441 podman[206831]: podman_exporter
Oct  4 01:26:23 np0005470441 systemd[1]: Started podman_exporter container.
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  4 01:26:23 np0005470441 podman[206857]: @ - - [04/Oct/2025:05:26:23 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  4 01:26:23 np0005470441 podman[206857]: time="2025-10-04T05:26:23Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  4 01:26:24 np0005470441 podman[206855]: 2025-10-04 05:26:24.006842884 +0000 UTC m=+0.075659966 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:26:24 np0005470441 podman[206857]: @ - - [04/Oct/2025:05:26:23 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22060 "" "Go-http-client/1.1"
Oct  4 01:26:24 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:24.007Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  4 01:26:24 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:24.008Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  4 01:26:24 np0005470441 podman_exporter[206846]: ts=2025-10-04T05:26:24.009Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  4 01:26:24 np0005470441 systemd[1]: 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-71168501779448bf.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:26:24 np0005470441 systemd[1]: 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-71168501779448bf.service: Failed with result 'exit-code'.
Oct  4 01:26:24 np0005470441 python3.9[207044]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:26:24 np0005470441 systemd[1]: Stopping podman_exporter container...
Oct  4 01:26:25 np0005470441 podman[206857]: @ - - [04/Oct/2025:05:26:23 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Oct  4 01:26:25 np0005470441 systemd[1]: libpod-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.scope: Deactivated successfully.
Oct  4 01:26:25 np0005470441 podman[207048]: 2025-10-04 05:26:25.029060599 +0000 UTC m=+0.099602355 container died 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:26:25 np0005470441 systemd[1]: 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-71168501779448bf.timer: Deactivated successfully.
Oct  4 01:26:25 np0005470441 systemd[1]: Stopped /usr/bin/podman healthcheck run 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.
Oct  4 01:26:25 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-userdata-shm.mount: Deactivated successfully.
Oct  4 01:26:25 np0005470441 systemd[1]: var-lib-containers-storage-overlay-a62ac1e41f1fe34d85472c693912d3f39c15bcc39769839b3d8eff2eb3bbcf31-merged.mount: Deactivated successfully.
Oct  4 01:26:25 np0005470441 podman[207048]: 2025-10-04 05:26:25.873906862 +0000 UTC m=+0.944448618 container cleanup 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:26:25 np0005470441 podman[207048]: podman_exporter
Oct  4 01:26:25 np0005470441 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  4 01:26:25 np0005470441 podman[207076]: podman_exporter
Oct  4 01:26:25 np0005470441 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  4 01:26:25 np0005470441 systemd[1]: Stopped podman_exporter container.
Oct  4 01:26:25 np0005470441 systemd[1]: Starting podman_exporter container...
Oct  4 01:26:25 np0005470441 podman[207075]: 2025-10-04 05:26:25.975441235 +0000 UTC m=+0.064418188 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:26:26 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:26 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a62ac1e41f1fe34d85472c693912d3f39c15bcc39769839b3d8eff2eb3bbcf31/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:26 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a62ac1e41f1fe34d85472c693912d3f39c15bcc39769839b3d8eff2eb3bbcf31/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:26 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.
Oct  4 01:26:26 np0005470441 podman[207101]: 2025-10-04 05:26:26.178547212 +0000 UTC m=+0.197208861 container init 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.192Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.193Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.193Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.193Z caller=handler.go:105 level=info collector=container
Oct  4 01:26:26 np0005470441 podman[206857]: @ - - [04/Oct/2025:05:26:26 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  4 01:26:26 np0005470441 podman[206857]: time="2025-10-04T05:26:26Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  4 01:26:26 np0005470441 podman[207101]: 2025-10-04 05:26:26.204970436 +0000 UTC m=+0.223632055 container start 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:26:26 np0005470441 podman[207101]: podman_exporter
Oct  4 01:26:26 np0005470441 systemd[1]: Started podman_exporter container.
Oct  4 01:26:26 np0005470441 podman[206857]: @ - - [04/Oct/2025:05:26:26 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22062 "" "Go-http-client/1.1"
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.274Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.275Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  4 01:26:26 np0005470441 podman_exporter[207120]: ts=2025-10-04T05:26:26.275Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  4 01:26:26 np0005470441 podman[207131]: 2025-10-04 05:26:26.279658232 +0000 UTC m=+0.066778509 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:26:26 np0005470441 systemd[1]: 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-63250bb7e2ca77a6.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:26:26 np0005470441 systemd[1]: 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112-63250bb7e2ca77a6.service: Failed with result 'exit-code'.
Oct  4 01:26:26 np0005470441 python3.9[207306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:26:27 np0005470441 python3.9[207429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759555586.477418-2205-206593535401511/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  4 01:26:28 np0005470441 python3.9[207581]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  4 01:26:29 np0005470441 python3.9[207733]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  4 01:26:30 np0005470441 python3[207885]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  4 01:26:31 np0005470441 podman[207916]: 2025-10-04 05:26:31.299416227 +0000 UTC m=+0.051879811 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:26:32 np0005470441 podman[207962]: 2025-10-04 05:26:32.94847776 +0000 UTC m=+0.829258756 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:26:33 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:26:33 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Failed with result 'exit-code'.
Oct  4 01:26:33 np0005470441 podman[207898]: 2025-10-04 05:26:33.201047804 +0000 UTC m=+2.598379899 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  4 01:26:33 np0005470441 podman[208034]: 2025-10-04 05:26:33.337401244 +0000 UTC m=+0.048926782 container create 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  4 01:26:33 np0005470441 podman[208034]: 2025-10-04 05:26:33.309148554 +0000 UTC m=+0.020674122 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  4 01:26:33 np0005470441 python3[207885]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  4 01:26:34 np0005470441 python3.9[208225]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:26:35 np0005470441 python3.9[208379]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:36 np0005470441 python3.9[208530]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759555595.2043858-2364-203148753756131/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:26:36 np0005470441 python3.9[208606]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  4 01:26:36 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:36 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:36 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:37 np0005470441 python3.9[208717]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  4 01:26:37 np0005470441 systemd[1]: Reloading.
Oct  4 01:26:37 np0005470441 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  4 01:26:37 np0005470441 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  4 01:26:37 np0005470441 systemd[1]: Starting openstack_network_exporter container...
Oct  4 01:26:38 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:38 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:38 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:38 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:38 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.
Oct  4 01:26:38 np0005470441 podman[208757]: 2025-10-04 05:26:38.409022857 +0000 UTC m=+0.494724636 container init 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=)
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *bridge.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *coverage.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *datapath.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *iface.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *memory.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *ovnnorthd.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *ovn.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *ovsdbserver.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *pmd_perf.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *pmd_rxq.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: INFO    05:26:38 main.go:48: registering *vswitch.Collector
Oct  4 01:26:38 np0005470441 openstack_network_exporter[208772]: NOTICE  05:26:38 main.go:76: listening on https://:9105/metrics
Oct  4 01:26:38 np0005470441 podman[208757]: 2025-10-04 05:26:38.441897506 +0000 UTC m=+0.527599285 container start 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:26:38 np0005470441 podman[208757]: openstack_network_exporter
Oct  4 01:26:38 np0005470441 systemd[1]: Started openstack_network_exporter container.
Oct  4 01:26:38 np0005470441 podman[208782]: 2025-10-04 05:26:38.61595878 +0000 UTC m=+0.167725805 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Oct  4 01:26:39 np0005470441 python3.9[208954]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  4 01:26:39 np0005470441 systemd[1]: Stopping openstack_network_exporter container...
Oct  4 01:26:39 np0005470441 systemd[1]: libpod-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.scope: Deactivated successfully.
Oct  4 01:26:39 np0005470441 podman[208958]: 2025-10-04 05:26:39.700133787 +0000 UTC m=+0.054052126 container died 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  4 01:26:39 np0005470441 systemd[1]: 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6-7cfd567c66d33e4e.timer: Deactivated successfully.
Oct  4 01:26:39 np0005470441 systemd[1]: Stopped /usr/bin/podman healthcheck run 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.
Oct  4 01:26:39 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6-userdata-shm.mount: Deactivated successfully.
Oct  4 01:26:39 np0005470441 systemd[1]: var-lib-containers-storage-overlay-2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d-merged.mount: Deactivated successfully.
Oct  4 01:26:40 np0005470441 podman[208958]: 2025-10-04 05:26:40.618274684 +0000 UTC m=+0.972193053 container cleanup 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Oct  4 01:26:40 np0005470441 podman[208958]: openstack_network_exporter
Oct  4 01:26:40 np0005470441 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  4 01:26:40 np0005470441 podman[208985]: openstack_network_exporter
Oct  4 01:26:40 np0005470441 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  4 01:26:40 np0005470441 systemd[1]: Stopped openstack_network_exporter container.
Oct  4 01:26:40 np0005470441 systemd[1]: Starting openstack_network_exporter container...
Oct  4 01:26:40 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:26:40 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:40 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:40 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2851a057a6055adf4e5110142bbbeb2bb980e62975f4e2504ee9b5a3ca138a6d/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  4 01:26:40 np0005470441 systemd[1]: Started /usr/bin/podman healthcheck run 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.
Oct  4 01:26:40 np0005470441 podman[208998]: 2025-10-04 05:26:40.827613768 +0000 UTC m=+0.126541095 container init 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *bridge.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *coverage.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *datapath.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *iface.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *memory.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *ovnnorthd.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *ovn.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *ovsdbserver.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *pmd_perf.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *pmd_rxq.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: INFO    05:26:40 main.go:48: registering *vswitch.Collector
Oct  4 01:26:40 np0005470441 openstack_network_exporter[209013]: NOTICE  05:26:40 main.go:76: listening on https://:9105/metrics
Oct  4 01:26:40 np0005470441 podman[208998]: 2025-10-04 05:26:40.860210259 +0000 UTC m=+0.159137576 container start 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Oct  4 01:26:40 np0005470441 podman[208998]: openstack_network_exporter
Oct  4 01:26:40 np0005470441 systemd[1]: Started openstack_network_exporter container.
Oct  4 01:26:40 np0005470441 podman[209024]: 2025-10-04 05:26:40.929621806 +0000 UTC m=+0.058855021 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
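The sequence above records a complete failure-and-recovery cycle for `edpm_openstack_network_exporter.service`: the main process exits with `status=2/INVALIDARGUMENT`, systemd marks the unit failed and restarts it, and the container comes back with a healthy first health check. When scanning long captures like this one, it can help to extract those transitions mechanically. A minimal sketch in Python; the regex and the `service_events` helper are illustrative tooling written for this log format, not anything present on the node:

```python
import re

# Syslog-style prefix: "Oct  4 01:26:40 np0005470441 <message>"
LINE_RE = re.compile(
    r"^(?P<month>\w{3})\s+(?P<day>\d+)\s+(?P<time>[\d:]+)\s+(?P<host>\S+)\s+(?P<msg>.*)$"
)

def parse_line(line):
    """Split one syslog-style line into prefix fields and the message body."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

def service_events(lines, unit="edpm_openstack_network_exporter.service"):
    """Yield (time, event) pairs for systemd failure messages about one unit."""
    for line in lines:
        rec = parse_line(line)
        if rec is None or unit not in rec["msg"]:
            continue
        if "Main process exited" in rec["msg"]:
            yield rec["time"], "exited"
        elif "Failed with result" in rec["msg"]:
            yield rec["time"], "failed"
```

Feeding the two systemd lines from this capture through `service_events` yields the exit and failure events with their timestamps, which makes restart loops easy to spot across a long file.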
Oct  4 01:26:42 np0005470441 python3.9[209196]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  4 01:26:42 np0005470441 podman[209197]: 2025-10-04 05:26:42.286497144 +0000 UTC m=+0.061689426 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:26:46 np0005470441 podman[209245]: 2025-10-04 05:26:46.290497635 +0000 UTC m=+0.046536931 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  4 01:26:48 np0005470441 podman[209265]: 2025-10-04 05:26:48.370176566 +0000 UTC m=+0.124813913 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.162 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.179 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.179 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.179 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.179 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.202 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.202 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.203 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.203 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.360 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.361 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6016MB free_disk=73.4994888305664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
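The `pci_devices=[...]` payload in the resource-view line above is a JSON array embedded directly in the message text, so it can be lifted out and parsed rather than read by eye. A sketch using a shortened sample payload; the regex and the `extract_pci_devices` helper are chosen here for illustration:

```python
import json
import re

# The array opens at "pci_devices=[" and contains no nested "]",
# so a non-greedy match up to the first "]" captures the whole payload.
PCI_RE = re.compile(r"pci_devices=(\[.*?\])")

def extract_pci_devices(msg):
    """Return the parsed pci_devices array from a resource-tracker message."""
    m = PCI_RE.search(msg)
    return json.loads(m.group(1)) if m else []
```

Once parsed, the entries can be filtered by `vendor_id`, `numa_node`, or `dev_type` like any other list of dicts.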
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.362 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.362 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.430 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.431 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.451 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.466 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.467 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:26:50 np0005470441 nova_compute[192626]: 2025-10-04 05:26:50.467 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
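Lock hold times such as the `held 0.106s` above come from oslo.concurrency's standard acquire/release messages, so they can be collected with a single pattern when auditing how long `compute_resources` is held per periodic run. A sketch; the regex and the `lock_hold_times` name are illustrative, written against the message format visible in these lines:

```python
import re

# oslo.concurrency release lines: Lock "<name>" "released" by "<owner>" :: held <t>s
HELD_RE = re.compile(
    r'Lock "(?P<name>[^"]+)" "released" by "(?P<owner>[^"]+)" :: held (?P<held>[\d.]+)s'
)

def lock_hold_times(lines):
    """Return (lock name, holder, seconds held) for each lock-release line."""
    out = []
    for line in lines:
        m = HELD_RE.search(line)
        if m:
            out.append((m.group("name"), m.group("owner"), float(m.group("held"))))
    return out
```

Summing or sorting the resulting tuples gives a quick view of which resource-tracker methods dominate lock hold time.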
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.005 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.005 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.005 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.005 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.022 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:26:51 np0005470441 nova_compute[192626]: 2025-10-04 05:26:51.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:26:56 np0005470441 podman[209288]: 2025-10-04 05:26:56.307471753 +0000 UTC m=+0.060432398 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  4 01:26:56 np0005470441 podman[209308]: 2025-10-04 05:26:56.379742206 +0000 UTC m=+0.047808918 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:27:02 np0005470441 podman[209331]: 2025-10-04 05:27:02.308878652 +0000 UTC m=+0.049179820 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.3)
Oct  4 01:27:03 np0005470441 podman[209351]: 2025-10-04 05:27:03.304380964 +0000 UTC m=+0.049648073 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:27:03 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Main process exited, code=exited, status=1/FAILURE
Oct  4 01:27:03 np0005470441 systemd[1]: a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26-339e8fa5981e6b63.service: Failed with result 'exit-code'.
Oct  4 01:27:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:27:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:27:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:27:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:27:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:27:06.731 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:27:08 np0005470441 python3.9[209497]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  4 01:27:09 np0005470441 python3.9[209662]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:09 np0005470441 systemd[1]: Started libpod-conmon-9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15.scope.
Oct  4 01:27:09 np0005470441 podman[209663]: 2025-10-04 05:27:09.599379491 +0000 UTC m=+0.081199392 container exec 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:09 np0005470441 podman[209663]: 2025-10-04 05:27:09.610000131 +0000 UTC m=+0.091820042 container exec_died 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:27:09 np0005470441 systemd[1]: libpod-conmon-9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15.scope: Deactivated successfully.
Oct  4 01:27:10 np0005470441 python3.9[209846]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:10 np0005470441 systemd[1]: Started libpod-conmon-9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15.scope.
Oct  4 01:27:10 np0005470441 podman[209847]: 2025-10-04 05:27:10.35662865 +0000 UTC m=+0.065127269 container exec 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  4 01:27:10 np0005470441 podman[209864]: 2025-10-04 05:27:10.426830271 +0000 UTC m=+0.055107588 container exec_died 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:27:10 np0005470441 podman[209847]: 2025-10-04 05:27:10.452013478 +0000 UTC m=+0.160512127 container exec_died 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  4 01:27:10 np0005470441 systemd[1]: libpod-conmon-9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15.scope: Deactivated successfully.
Oct  4 01:27:11 np0005470441 podman[210028]: 2025-10-04 05:27:11.049664718 +0000 UTC m=+0.067840010 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, vendor=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  4 01:27:11 np0005470441 python3.9[210029]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:11 np0005470441 python3.9[210203]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  4 01:27:12 np0005470441 podman[210341]: 2025-10-04 05:27:12.481731007 +0000 UTC m=+0.059672615 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:27:12 np0005470441 python3.9[210391]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:12 np0005470441 systemd[1]: Started libpod-conmon-60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3.scope.
Oct  4 01:27:12 np0005470441 podman[210392]: 2025-10-04 05:27:12.870881818 +0000 UTC m=+0.155014512 container exec 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:27:12 np0005470441 podman[210411]: 2025-10-04 05:27:12.941800441 +0000 UTC m=+0.056843010 container exec_died 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:27:12 np0005470441 podman[210392]: 2025-10-04 05:27:12.965504304 +0000 UTC m=+0.249636998 container exec_died 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:27:12 np0005470441 systemd[1]: libpod-conmon-60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3.scope: Deactivated successfully.
Oct  4 01:27:13 np0005470441 python3.9[210574]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:13 np0005470441 systemd[1]: Started libpod-conmon-60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3.scope.
Oct  4 01:27:13 np0005470441 podman[210575]: 2025-10-04 05:27:13.873565307 +0000 UTC m=+0.156447565 container exec 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  4 01:27:13 np0005470441 podman[210592]: 2025-10-04 05:27:13.959813441 +0000 UTC m=+0.069834021 container exec_died 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:13 np0005470441 podman[210575]: 2025-10-04 05:27:13.993836614 +0000 UTC m=+0.276718872 container exec_died 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:27:13 np0005470441 systemd[1]: libpod-conmon-60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3.scope: Deactivated successfully.
Oct  4 01:27:14 np0005470441 python3.9[210758]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:15 np0005470441 python3.9[210910]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  4 01:27:16 np0005470441 python3.9[211075]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:16 np0005470441 systemd[1]: Started libpod-conmon-9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa.scope.
Oct  4 01:27:16 np0005470441 podman[211076]: 2025-10-04 05:27:16.511559646 +0000 UTC m=+0.095515303 container exec 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Oct  4 01:27:16 np0005470441 podman[211076]: 2025-10-04 05:27:16.542101124 +0000 UTC m=+0.126056761 container exec_died 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:27:16 np0005470441 systemd[1]: libpod-conmon-9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa.scope: Deactivated successfully.
Oct  4 01:27:16 np0005470441 podman[211092]: 2025-10-04 05:27:16.663773033 +0000 UTC m=+0.144650701 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:27:17 np0005470441 python3.9[211274]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:17 np0005470441 systemd[1]: Started libpod-conmon-9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa.scope.
Oct  4 01:27:17 np0005470441 podman[211275]: 2025-10-04 05:27:17.380624387 +0000 UTC m=+0.067708237 container exec 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:27:17 np0005470441 podman[211295]: 2025-10-04 05:27:17.444767286 +0000 UTC m=+0.051322324 container exec_died 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  4 01:27:17 np0005470441 podman[211275]: 2025-10-04 05:27:17.452085836 +0000 UTC m=+0.139169676 container exec_died 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:17 np0005470441 systemd[1]: libpod-conmon-9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa.scope: Deactivated successfully.
Oct  4 01:27:18 np0005470441 python3.9[211459]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:18 np0005470441 podman[211583]: 2025-10-04 05:27:18.778014403 +0000 UTC m=+0.081939595 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:27:18 np0005470441 python3.9[211632]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  4 01:27:19 np0005470441 python3.9[211802]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:19 np0005470441 systemd[1]: Started libpod-conmon-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.scope.
Oct  4 01:27:19 np0005470441 podman[211803]: 2025-10-04 05:27:19.784078403 +0000 UTC m=+0.083492021 container exec 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:19 np0005470441 podman[211803]: 2025-10-04 05:27:19.81888372 +0000 UTC m=+0.118297308 container exec_died 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:27:19 np0005470441 systemd[1]: libpod-conmon-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.scope: Deactivated successfully.
Oct  4 01:27:20 np0005470441 python3.9[211986]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:20 np0005470441 systemd[1]: Started libpod-conmon-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.scope.
Oct  4 01:27:20 np0005470441 podman[211987]: 2025-10-04 05:27:20.552911071 +0000 UTC m=+0.067862762 container exec 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:27:20 np0005470441 podman[211987]: 2025-10-04 05:27:20.587957654 +0000 UTC m=+0.102909285 container exec_died 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:20 np0005470441 systemd[1]: libpod-conmon-3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8.scope: Deactivated successfully.
Oct  4 01:27:21 np0005470441 python3.9[212170]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:21 np0005470441 python3.9[212322]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  4 01:27:22 np0005470441 python3.9[212487]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:22 np0005470441 systemd[1]: Started libpod-conmon-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope.
Oct  4 01:27:22 np0005470441 podman[212488]: 2025-10-04 05:27:22.744435855 +0000 UTC m=+0.069854792 container exec a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:27:22 np0005470441 podman[212508]: 2025-10-04 05:27:22.806714008 +0000 UTC m=+0.048970204 container exec_died a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:27:22 np0005470441 podman[212488]: 2025-10-04 05:27:22.811260474 +0000 UTC m=+0.136679411 container exec_died a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:27:22 np0005470441 systemd[1]: libpod-conmon-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope: Deactivated successfully.
Oct  4 01:27:23 np0005470441 python3.9[212670]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:23 np0005470441 systemd[1]: Started libpod-conmon-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope.
Oct  4 01:27:23 np0005470441 podman[212671]: 2025-10-04 05:27:23.497544829 +0000 UTC m=+0.068570792 container exec a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:27:23 np0005470441 podman[212691]: 2025-10-04 05:27:23.562765551 +0000 UTC m=+0.052921393 container exec_died a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:27:23 np0005470441 podman[212671]: 2025-10-04 05:27:23.568092991 +0000 UTC m=+0.139118954 container exec_died a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:27:23 np0005470441 systemd[1]: libpod-conmon-a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26.scope: Deactivated successfully.
Oct  4 01:27:24 np0005470441 python3.9[212855]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:24 np0005470441 python3.9[213007]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  4 01:27:25 np0005470441 python3.9[213172]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:25 np0005470441 systemd[1]: Started libpod-conmon-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope.
Oct  4 01:27:25 np0005470441 podman[213173]: 2025-10-04 05:27:25.795947787 +0000 UTC m=+0.072073368 container exec 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:27:25 np0005470441 podman[213173]: 2025-10-04 05:27:25.826869037 +0000 UTC m=+0.102994588 container exec_died 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:27:25 np0005470441 systemd[1]: libpod-conmon-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope: Deactivated successfully.
Oct  4 01:27:26 np0005470441 podman[213327]: 2025-10-04 05:27:26.451669673 +0000 UTC m=+0.068709937 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:27:26 np0005470441 podman[213373]: 2025-10-04 05:27:26.507075199 +0000 UTC m=+0.048352815 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:27:26 np0005470441 python3.9[213375]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:26 np0005470441 systemd[1]: Started libpod-conmon-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope.
Oct  4 01:27:26 np0005470441 podman[213397]: 2025-10-04 05:27:26.831472073 +0000 UTC m=+0.143345721 container exec 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:27:26 np0005470441 podman[213416]: 2025-10-04 05:27:26.956767121 +0000 UTC m=+0.111763772 container exec_died 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:27:26 np0005470441 podman[213397]: 2025-10-04 05:27:26.978399401 +0000 UTC m=+0.290273029 container exec_died 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:27:26 np0005470441 systemd[1]: libpod-conmon-69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e.scope: Deactivated successfully.
Oct  4 01:27:27 np0005470441 python3.9[213580]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:28 np0005470441 python3.9[213732]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  4 01:27:29 np0005470441 python3.9[213897]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:29 np0005470441 systemd[1]: Started libpod-conmon-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.scope.
Oct  4 01:27:29 np0005470441 podman[213898]: 2025-10-04 05:27:29.429190471 +0000 UTC m=+0.106124602 container exec 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:27:29 np0005470441 podman[213898]: 2025-10-04 05:27:29.461334917 +0000 UTC m=+0.138269048 container exec_died 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:27:29 np0005470441 systemd[1]: libpod-conmon-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.scope: Deactivated successfully.
Oct  4 01:27:30 np0005470441 python3.9[214081]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:30 np0005470441 systemd[1]: Started libpod-conmon-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.scope.
Oct  4 01:27:30 np0005470441 podman[214082]: 2025-10-04 05:27:30.235063212 +0000 UTC m=+0.066403988 container exec 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:27:30 np0005470441 podman[214082]: 2025-10-04 05:27:30.268735894 +0000 UTC m=+0.100076670 container exec_died 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:27:30 np0005470441 systemd[1]: libpod-conmon-46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112.scope: Deactivated successfully.
Oct  4 01:27:30 np0005470441 python3.9[214266]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:31 np0005470441 python3.9[214418]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  4 01:27:32 np0005470441 python3.9[214583]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:32 np0005470441 systemd[1]: Started libpod-conmon-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.scope.
Oct  4 01:27:32 np0005470441 podman[214584]: 2025-10-04 05:27:32.512369025 +0000 UTC m=+0.110992928 container exec 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:27:32 np0005470441 podman[214584]: 2025-10-04 05:27:32.640017273 +0000 UTC m=+0.238641156 container exec_died 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:27:32 np0005470441 systemd[1]: libpod-conmon-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.scope: Deactivated successfully.
Oct  4 01:27:32 np0005470441 podman[214601]: 2025-10-04 05:27:32.853710728 +0000 UTC m=+0.332679734 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  4 01:27:33 np0005470441 podman[214788]: 2025-10-04 05:27:33.409968525 +0000 UTC m=+0.064080578 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:27:33 np0005470441 python3.9[214789]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  4 01:27:33 np0005470441 systemd[1]: Started libpod-conmon-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.scope.
Oct  4 01:27:33 np0005470441 podman[214809]: 2025-10-04 05:27:33.665178869 +0000 UTC m=+0.074325046 container exec 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc.)
Oct  4 01:27:33 np0005470441 podman[214809]: 2025-10-04 05:27:33.698208382 +0000 UTC m=+0.107354539 container exec_died 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:27:33 np0005470441 systemd[1]: libpod-conmon-0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6.scope: Deactivated successfully.
Oct  4 01:27:34 np0005470441 python3.9[214992]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:27:41 np0005470441 podman[215017]: 2025-10-04 05:27:41.312306871 +0000 UTC m=+0.067635514 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct  4 01:27:43 np0005470441 podman[215038]: 2025-10-04 05:27:43.302956235 +0000 UTC m=+0.058049206 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:27:47 np0005470441 podman[215062]: 2025-10-04 05:27:47.287990987 +0000 UTC m=+0.045632953 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:27:49 np0005470441 podman[215082]: 2025-10-04 05:27:49.349315607 +0000 UTC m=+0.092145262 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:27:49 np0005470441 nova_compute[192626]: 2025-10-04 05:27:49.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.731 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.731 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.754 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.754 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.754 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.755 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.888 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.889 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6060MB free_disk=73.49849319458008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.889 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.889 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.949 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.949 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.968 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.980 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.982 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:27:50 np0005470441 nova_compute[192626]: 2025-10-04 05:27:50.982 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.967 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.968 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.968 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.968 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.969 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:51 np0005470441 nova_compute[192626]: 2025-10-04 05:27:51.969 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:27:53 np0005470441 nova_compute[192626]: 2025-10-04 05:27:53.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:27:57 np0005470441 podman[215110]: 2025-10-04 05:27:57.301743969 +0000 UTC m=+0.054385446 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3)
Oct  4 01:27:57 np0005470441 podman[215111]: 2025-10-04 05:27:57.31838627 +0000 UTC m=+0.067304455 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:28:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:28:02 np0005470441 podman[215249]: 2025-10-04 05:28:02.97231535 +0000 UTC m=+0.062041626 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:28:03 np0005470441 python3.9[215294]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:03 np0005470441 podman[215421]: 2025-10-04 05:28:03.734552499 +0000 UTC m=+0.064479039 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:28:03 np0005470441 python3.9[215466]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:04 np0005470441 python3.9[215589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759555683.426713-3306-161444502216633/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:05 np0005470441 python3.9[215741]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:06 np0005470441 python3.9[215893]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:06 np0005470441 python3.9[215971]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:06.732 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:28:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:06.732 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:28:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:06.732 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:28:07 np0005470441 python3.9[216123]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:08 np0005470441 python3.9[216201]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rv82dhtm recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:08 np0005470441 python3.9[216353]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:09 np0005470441 python3.9[216431]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:10 np0005470441 python3.9[216583]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:28:11 np0005470441 python3[216736]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  4 01:28:12 np0005470441 podman[216860]: 2025-10-04 05:28:12.025324975 +0000 UTC m=+0.083543273 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1755695350, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  4 01:28:12 np0005470441 python3.9[216908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:12 np0005470441 python3.9[216989]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:13 np0005470441 podman[217113]: 2025-10-04 05:28:13.585381343 +0000 UTC m=+0.048926782 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:28:13 np0005470441 python3.9[217165]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:14 np0005470441 python3.9[217243]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:14 np0005470441 python3.9[217395]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:15 np0005470441 python3.9[217473]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:16 np0005470441 python3.9[217625]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:16 np0005470441 python3.9[217703]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:17 np0005470441 podman[217827]: 2025-10-04 05:28:17.843358121 +0000 UTC m=+0.045255802 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:28:18 np0005470441 python3.9[217874]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  4 01:28:18 np0005470441 python3.9[217999]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759555697.396659-3681-131273635589246/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:19 np0005470441 python3.9[218151]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:20 np0005470441 podman[218275]: 2025-10-04 05:28:20.117066466 +0000 UTC m=+0.100805342 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct  4 01:28:20 np0005470441 python3.9[218323]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:28:20 np0005470441 systemd[1]: packagekit.service: Deactivated successfully.
Oct  4 01:28:21 np0005470441 python3.9[218484]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:22 np0005470441 python3.9[218636]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:28:23 np0005470441 python3.9[218789]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  4 01:28:23 np0005470441 python3.9[218943]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  4 01:28:24 np0005470441 python3.9[219098]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  4 01:28:25 np0005470441 systemd[1]: session-29.scope: Deactivated successfully.
Oct  4 01:28:25 np0005470441 systemd[1]: session-29.scope: Consumed 1min 43.491s CPU time.
Oct  4 01:28:25 np0005470441 systemd-logind[796]: Session 29 logged out. Waiting for processes to exit.
Oct  4 01:28:25 np0005470441 systemd-logind[796]: Removed session 29.
Oct  4 01:28:28 np0005470441 podman[219123]: 2025-10-04 05:28:28.294290237 +0000 UTC m=+0.050393476 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:28:28 np0005470441 podman[219124]: 2025-10-04 05:28:28.304262457 +0000 UTC m=+0.054347905 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:28:33 np0005470441 podman[219167]: 2025-10-04 05:28:33.323579586 +0000 UTC m=+0.076486590 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:28:34 np0005470441 podman[219187]: 2025-10-04 05:28:34.299464009 +0000 UTC m=+0.053154589 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:28:42 np0005470441 podman[219208]: 2025-10-04 05:28:42.295428841 +0000 UTC m=+0.053504599 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:28:44 np0005470441 podman[219228]: 2025-10-04 05:28:44.294609771 +0000 UTC m=+0.052329535 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:28:48 np0005470441 podman[219252]: 2025-10-04 05:28:48.306594172 +0000 UTC m=+0.060684975 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  4 01:28:49 np0005470441 nova_compute[192626]: 2025-10-04 05:28:49.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:50 np0005470441 podman[219271]: 2025-10-04 05:28:50.372440788 +0000 UTC m=+0.126298538 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.745 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.884 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.885 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6097MB free_disk=73.50236129760742GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.885 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.885 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.945 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.945 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.967 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.980 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.981 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:28:50 np0005470441 nova_compute[192626]: 2025-10-04 05:28:50.981 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:28:51 np0005470441 nova_compute[192626]: 2025-10-04 05:28:51.982 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:51 np0005470441 nova_compute[192626]: 2025-10-04 05:28:51.982 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:28:51 np0005470441 nova_compute[192626]: 2025-10-04 05:28:51.982 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:28:52 np0005470441 nova_compute[192626]: 2025-10-04 05:28:52.010 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:28:52 np0005470441 nova_compute[192626]: 2025-10-04 05:28:52.011 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:52 np0005470441 nova_compute[192626]: 2025-10-04 05:28:52.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:52 np0005470441 nova_compute[192626]: 2025-10-04 05:28:52.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:52 np0005470441 nova_compute[192626]: 2025-10-04 05:28:52.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:28:53 np0005470441 nova_compute[192626]: 2025-10-04 05:28:53.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:53 np0005470441 nova_compute[192626]: 2025-10-04 05:28:53.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:53 np0005470441 nova_compute[192626]: 2025-10-04 05:28:53.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:28:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:57.306 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:28:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:57.307 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:28:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:28:57.308 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:28:59 np0005470441 podman[219297]: 2025-10-04 05:28:59.296552557 +0000 UTC m=+0.050987094 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:28:59 np0005470441 podman[219298]: 2025-10-04 05:28:59.302713233 +0000 UTC m=+0.050064917 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:29:04 np0005470441 podman[219341]: 2025-10-04 05:29:04.301460814 +0000 UTC m=+0.055181280 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  4 01:29:04 np0005470441 podman[219362]: 2025-10-04 05:29:04.389270834 +0000 UTC m=+0.058020245 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:29:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:29:06.733 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:29:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:29:06.733 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:29:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:29:06.734 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:29:13 np0005470441 podman[219382]: 2025-10-04 05:29:13.325433483 +0000 UTC m=+0.083586044 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  4 01:29:15 np0005470441 podman[219403]: 2025-10-04 05:29:15.289737115 +0000 UTC m=+0.047235341 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:29:19 np0005470441 podman[219429]: 2025-10-04 05:29:19.291917073 +0000 UTC m=+0.045474608 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:29:21 np0005470441 podman[219445]: 2025-10-04 05:29:21.32254936 +0000 UTC m=+0.081894174 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:29:30 np0005470441 podman[219471]: 2025-10-04 05:29:30.293462986 +0000 UTC m=+0.048806358 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:29:30 np0005470441 podman[219472]: 2025-10-04 05:29:30.305375814 +0000 UTC m=+0.051432007 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:29:35 np0005470441 podman[219517]: 2025-10-04 05:29:35.304225468 +0000 UTC m=+0.058828020 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:29:35 np0005470441 podman[219518]: 2025-10-04 05:29:35.343345184 +0000 UTC m=+0.089933435 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:29:44 np0005470441 podman[219555]: 2025-10-04 05:29:44.31093253 +0000 UTC m=+0.054723637 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct  4 01:29:46 np0005470441 podman[219577]: 2025-10-04 05:29:46.342764242 +0000 UTC m=+0.075830711 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.784 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.785 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.785 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Oct  4 01:29:49 np0005470441 nova_compute[192626]: 2025-10-04 05:29:49.805 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:29:50 np0005470441 podman[219602]: 2025-10-04 05:29:50.294375039 +0000 UTC m=+0.046699885 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:29:51 np0005470441 nova_compute[192626]: 2025-10-04 05:29:51.846 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.186 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.187 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.187 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.187 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  4 01:29:52 np0005470441 podman[219621]: 2025-10-04 05:29:52.332433109 +0000 UTC m=+0.081699967 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller)
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.343 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.344 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6119MB free_disk=73.50175094604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.345 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.345 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.495 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.495 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.549 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.640 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.640 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.657 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.681 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.708 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.726 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.728 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:29:52 np0005470441 nova_compute[192626]: 2025-10-04 05:29:52.728 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.601 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.601 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.602 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.602 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.713 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:29:53 np0005470441 nova_compute[192626]: 2025-10-04 05:29:53.751 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:29:54 np0005470441 nova_compute[192626]: 2025-10-04 05:29:54.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:54 np0005470441 nova_compute[192626]: 2025-10-04 05:29:54.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:29:55 np0005470441 nova_compute[192626]: 2025-10-04 05:29:55.719 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:01 np0005470441 podman[219648]: 2025-10-04 05:30:01.297330816 +0000 UTC m=+0.051453368 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:30:01 np0005470441 podman[219647]: 2025-10-04 05:30:01.302527902 +0000 UTC m=+0.060044086 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.705 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.706 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.707 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:30:02.708 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:30:06 np0005470441 podman[219693]: 2025-10-04 05:30:06.300426948 +0000 UTC m=+0.056927363 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:30:06 np0005470441 podman[219694]: 2025-10-04 05:30:06.306352386 +0000 UTC m=+0.058336905 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:30:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:06.734 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:30:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:06.735 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:30:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:06.736 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:30:15 np0005470441 podman[219733]: 2025-10-04 05:30:15.306640354 +0000 UTC m=+0.058158960 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Oct  4 01:30:17 np0005470441 podman[219755]: 2025-10-04 05:30:17.305139834 +0000 UTC m=+0.059240152 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:30:21 np0005470441 podman[219780]: 2025-10-04 05:30:21.298409283 +0000 UTC m=+0.057368326 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:30:23 np0005470441 podman[219799]: 2025-10-04 05:30:23.326294477 +0000 UTC m=+0.077522642 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:30:32 np0005470441 podman[219828]: 2025-10-04 05:30:32.307292795 +0000 UTC m=+0.060343806 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:30:32 np0005470441 podman[219827]: 2025-10-04 05:30:32.308946455 +0000 UTC m=+0.067285015 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  4 01:30:37 np0005470441 podman[219871]: 2025-10-04 05:30:37.311590104 +0000 UTC m=+0.057236832 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:30:37 np0005470441 podman[219870]: 2025-10-04 05:30:37.316648966 +0000 UTC m=+0.066010116 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  4 01:30:46 np0005470441 podman[219912]: 2025-10-04 05:30:46.316642785 +0000 UTC m=+0.065335695 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, release=1755695350, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:30:48 np0005470441 podman[219931]: 2025-10-04 05:30:48.323420565 +0000 UTC m=+0.079295765 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:30:49 np0005470441 nova_compute[192626]: 2025-10-04 05:30:49.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:52 np0005470441 podman[219955]: 2025-10-04 05:30:52.323600822 +0000 UTC m=+0.081742799 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:30:52 np0005470441 nova_compute[192626]: 2025-10-04 05:30:52.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:52 np0005470441 nova_compute[192626]: 2025-10-04 05:30:52.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.742 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.743 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.774 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.775 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.775 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.776 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.944 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.946 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6105MB free_disk=73.50175094604492GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.946 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:30:53 np0005470441 nova_compute[192626]: 2025-10-04 05:30:53.947 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.030 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.030 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.057 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.085 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.088 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:30:54 np0005470441 nova_compute[192626]: 2025-10-04 05:30:54.088 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:30:54 np0005470441 podman[219975]: 2025-10-04 05:30:54.37123452 +0000 UTC m=+0.118178405 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:30:55 np0005470441 nova_compute[192626]: 2025-10-04 05:30:55.062 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:55 np0005470441 nova_compute[192626]: 2025-10-04 05:30:55.062 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:55 np0005470441 nova_compute[192626]: 2025-10-04 05:30:55.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:55 np0005470441 nova_compute[192626]: 2025-10-04 05:30:55.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:56 np0005470441 nova_compute[192626]: 2025-10-04 05:30:56.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:30:58 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:58.335 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:30:58 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:58.336 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:30:59 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:30:59.338 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:03 np0005470441 podman[220002]: 2025-10-04 05:31:03.314397877 +0000 UTC m=+0.057895965 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:31:03 np0005470441 podman[220001]: 2025-10-04 05:31:03.338957787 +0000 UTC m=+0.085253805 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:31:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:06.735 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:06.736 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:06.736 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:08 np0005470441 podman[220043]: 2025-10-04 05:31:08.297812998 +0000 UTC m=+0.052105969 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  4 01:31:08 np0005470441 podman[220044]: 2025-10-04 05:31:08.305474555 +0000 UTC m=+0.051969785 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:31:17 np0005470441 podman[220082]: 2025-10-04 05:31:17.312905426 +0000 UTC m=+0.064584151 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6)
Oct  4 01:31:19 np0005470441 podman[220103]: 2025-10-04 05:31:19.291282661 +0000 UTC m=+0.048117440 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:31:23 np0005470441 podman[220127]: 2025-10-04 05:31:23.298187849 +0000 UTC m=+0.050006651 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true)
Oct  4 01:31:25 np0005470441 podman[220146]: 2025-10-04 05:31:25.329256742 +0000 UTC m=+0.086467836 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller)
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.049 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.049 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.069 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.191 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.192 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.196 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.197 2 INFO nova.compute.claims [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.317 2 DEBUG nova.compute.provider_tree [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.335 2 DEBUG nova.scheduler.client.report [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.359 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.360 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.430 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.430 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.472 2 INFO nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.501 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.627 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.629 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.630 2 INFO nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Creating image(s)#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.630 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.631 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.632 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.632 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:28 np0005470441 nova_compute[192626]: 2025-10-04 05:31:28.633 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:29 np0005470441 nova_compute[192626]: 2025-10-04 05:31:29.374 2 WARNING oslo_policy.policy [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  4 01:31:29 np0005470441 nova_compute[192626]: 2025-10-04 05:31:29.375 2 WARNING oslo_policy.policy [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  4 01:31:29 np0005470441 nova_compute[192626]: 2025-10-04 05:31:29.378 2 DEBUG nova.policy [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.414 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.506 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.part --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.507 2 DEBUG nova.virt.images [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] 2b7414ad-3419-4b92-8471-b72003f69821 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.509 2 DEBUG nova.privsep.utils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.509 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.part /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.915 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.part /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.converted" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:30 np0005470441 nova_compute[192626]: 2025-10-04 05:31:30.919 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.002 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e.converted --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.003 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.016 2 INFO oslo.privsep.daemon [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmplz0xa631/privsep.sock']#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.709 2 INFO oslo.privsep.daemon [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.578 55 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.582 55 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.584 55 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.584 55 INFO oslo.privsep.daemon [-] privsep daemon running as pid 55#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.785 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.836 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
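The repeated `/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- ... qemu-img info` invocations above wrap `qemu-img` in a watchdog that caps the child process's address space (1 GiB) and CPU time (30 s), so a malformed or hostile image cannot wedge the compute service. A minimal stdlib sketch of the same pattern is shown below; the helper name `run_limited` and the harmless stand-in command are illustrative, not part of Nova:

```python
import resource
import subprocess
import sys

def run_limited(cmd, as_bytes=1 << 30, cpu_seconds=30):
    """Run cmd with the child's address space and CPU time capped,
    matching the limits the log shows (--as=1073741824 --cpu=30)."""
    def set_limits():
        # Runs in the child between fork() and exec(); POSIX only.
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_seconds, cpu_seconds))
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, text=True, check=True)

# A harmless stand-in for the qemu-img invocation recorded in the log.
result = run_limited([sys.executable, "-c", "print('ok')"])
print(result.stdout.strip())
```

If the child exceeds either limit the kernel kills it (or malloc fails), `qemu-img` exits non-zero, and the caller sees an error instead of a hung worker.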
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.837 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.838 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.850 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.900 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.901 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.976 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk 1073741824" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.977 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:31 np0005470441 nova_compute[192626]: 2025-10-04 05:31:31.978 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.026 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.027 2 DEBUG nova.virt.disk.api [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Checking if we can resize image /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.028 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.046 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Successfully created port: 1b5b60bd-2531-4381-84cd-eb569ec9274c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.086 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.086 2 DEBUG nova.virt.disk.api [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Cannot resize image /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.087 2 DEBUG nova.objects.instance [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'migration_context' on Instance uuid a1baa49c-f428-4e4d-801c-abc2136158a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.104 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.105 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Ensure instance console log exists: /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.105 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.106 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:32 np0005470441 nova_compute[192626]: 2025-10-04 05:31:32.106 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:34 np0005470441 podman[220210]: 2025-10-04 05:31:34.30332847 +0000 UTC m=+0.060123167 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:31:34 np0005470441 podman[220211]: 2025-10-04 05:31:34.3061138 +0000 UTC m=+0.055006212 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:31:35 np0005470441 nova_compute[192626]: 2025-10-04 05:31:35.661 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Successfully updated port: 1b5b60bd-2531-4381-84cd-eb569ec9274c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:31:35 np0005470441 nova_compute[192626]: 2025-10-04 05:31:35.678 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:31:35 np0005470441 nova_compute[192626]: 2025-10-04 05:31:35.678 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquired lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:31:35 np0005470441 nova_compute[192626]: 2025-10-04 05:31:35.679 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:31:36 np0005470441 nova_compute[192626]: 2025-10-04 05:31:36.014 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:31:36 np0005470441 nova_compute[192626]: 2025-10-04 05:31:36.514 2 DEBUG nova.compute.manager [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-changed-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:31:36 np0005470441 nova_compute[192626]: 2025-10-04 05:31:36.515 2 DEBUG nova.compute.manager [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Refreshing instance network info cache due to event network-changed-1b5b60bd-2531-4381-84cd-eb569ec9274c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:31:36 np0005470441 nova_compute[192626]: 2025-10-04 05:31:36.515 2 DEBUG oslo_concurrency.lockutils [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.924 2 DEBUG nova.network.neutron [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating instance_info_cache with network_info: [{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.958 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Releasing lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.958 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Instance network_info: |[{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.959 2 DEBUG oslo_concurrency.lockutils [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.959 2 DEBUG nova.network.neutron [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Refreshing network info cache for port 1b5b60bd-2531-4381-84cd-eb569ec9274c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.961 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Start _get_guest_xml network_info=[{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.965 2 WARNING nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.975 2 DEBUG nova.virt.libvirt.host [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.975 2 DEBUG nova.virt.libvirt.host [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.986 2 DEBUG nova.virt.libvirt.host [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.987 2 DEBUG nova.virt.libvirt.host [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.989 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.990 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.990 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.990 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.991 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.991 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.991 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.991 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.991 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.992 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.992 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.992 2 DEBUG nova.virt.hardware [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.996 2 DEBUG nova.privsep.utils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.996 2 DEBUG nova.virt.libvirt.vif [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-675739622',display_name='tempest-TestNetworkAdvancedServerOps-server-675739622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-675739622',id=2,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP5+CRSM3/W/3cdhXTvJjWK5UqBbkM4aujgf+ON1jBkOSGjcEuxVD5W29TRWk+OUAt6wyZdunDlHRBm9PNDcqsoaG2HxeOcc3JYO5cd3/bCy4UrUPgcVg69owsQ+Gy2tQQ==',key_name='tempest-TestNetworkAdvancedServerOps-2087633968',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-vf0w3tlq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:31:28Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=a1baa49c-f428-4e4d-801c-abc2136158a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.997 2 DEBUG nova.network.os_vif_util [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.997 2 DEBUG nova.network.os_vif_util [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:31:37 np0005470441 nova_compute[192626]: 2025-10-04 05:31:37.999 2 DEBUG nova.objects.instance [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid a1baa49c-f428-4e4d-801c-abc2136158a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.021 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <uuid>a1baa49c-f428-4e4d-801c-abc2136158a1</uuid>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <name>instance-00000002</name>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-675739622</nova:name>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:31:37</nova:creationTime>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:user uuid="d65c768451494a3f9e4f9a238fa5c40d">tempest-TestNetworkAdvancedServerOps-1635331179-project-member</nova:user>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:project uuid="d0c087ea0f62444e80490916b42c760f">tempest-TestNetworkAdvancedServerOps-1635331179</nova:project>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        <nova:port uuid="1b5b60bd-2531-4381-84cd-eb569ec9274c">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="serial">a1baa49c-f428-4e4d-801c-abc2136158a1</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="uuid">a1baa49c-f428-4e4d-801c-abc2136158a1</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:d1:d1:e0"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <target dev="tap1b5b60bd-25"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/console.log" append="off"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:31:38 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:31:38 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:31:38 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:31:38 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.023 2 DEBUG nova.compute.manager [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Preparing to wait for external event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.023 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.023 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.024 2 DEBUG oslo_concurrency.lockutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.024 2 DEBUG nova.virt.libvirt.vif [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-675739622',display_name='tempest-TestNetworkAdvancedServerOps-server-675739622',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-675739622',id=2,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP5+CRSM3/W/3cdhXTvJjWK5UqBbkM4aujgf+ON1jBkOSGjcEuxVD5W29TRWk+OUAt6wyZdunDlHRBm9PNDcqsoaG2HxeOcc3JYO5cd3/bCy4UrUPgcVg69owsQ+Gy2tQQ==',key_name='tempest-TestNetworkAdvancedServerOps-2087633968',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-vf0w3tlq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:31:28Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=a1baa49c-f428-4e4d-801c-abc2136158a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.025 2 DEBUG nova.network.os_vif_util [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.025 2 DEBUG nova.network.os_vif_util [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.026 2 DEBUG os_vif [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.063 2 DEBUG ovsdbapp.backend.ovs_idl [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.064 2 DEBUG ovsdbapp.backend.ovs_idl [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.064 2 DEBUG ovsdbapp.backend.ovs_idl [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.083 2 INFO oslo.privsep.daemon [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp6ai11twq/privsep.sock']#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.766 2 INFO oslo.privsep.daemon [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.650 76 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.655 76 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.659 76 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  4 01:31:38 np0005470441 nova_compute[192626]: 2025-10-04 05:31:38.659 76 INFO oslo.privsep.daemon [-] privsep daemon running as pid 76#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.102 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b5b60bd-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.103 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b5b60bd-25, col_values=(('external_ids', {'iface-id': '1b5b60bd-2531-4381-84cd-eb569ec9274c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:d1:e0', 'vm-uuid': 'a1baa49c-f428-4e4d-801c-abc2136158a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:39 np0005470441 NetworkManager[51690]: <info>  [1759555899.1062] manager: (tap1b5b60bd-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.113 2 INFO os_vif [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25')#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.191 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.191 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.191 2 DEBUG nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No VIF found with MAC fa:16:3e:d1:d1:e0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:31:39 np0005470441 nova_compute[192626]: 2025-10-04 05:31:39.192 2 INFO nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Using config drive#033[00m
Oct  4 01:31:39 np0005470441 podman[220262]: 2025-10-04 05:31:39.308644249 +0000 UTC m=+0.059220458 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:31:39 np0005470441 podman[220263]: 2025-10-04 05:31:39.339387269 +0000 UTC m=+0.086276510 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm)
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.666 2 INFO nova.virt.libvirt.driver [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Creating config drive at /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config#033[00m
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.671 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpejdsyw5u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.793 2 DEBUG oslo_concurrency.processutils [None req-49c972e4-f167-4689-ac43-06512d662cd6 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpejdsyw5u" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:40 np0005470441 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  4 01:31:40 np0005470441 kernel: tap1b5b60bd-25: entered promiscuous mode
Oct  4 01:31:40 np0005470441 NetworkManager[51690]: <info>  [1759555900.8687] manager: (tap1b5b60bd-25): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Oct  4 01:31:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:40Z|00027|binding|INFO|Claiming lport 1b5b60bd-2531-4381-84cd-eb569ec9274c for this chassis.
Oct  4 01:31:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:40Z|00028|binding|INFO|1b5b60bd-2531-4381-84cd-eb569ec9274c: Claiming fa:16:3e:d1:d1:e0 10.100.0.8
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:40.888 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d1:e0 10.100.0.8'], port_security=['fa:16:3e:d1:d1:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ce69795d-c8bb-4412-99cd-26423ff2a719', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93ea15ef-b67e-4d79-b2d8-fce7c9643ed0, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1b5b60bd-2531-4381-84cd-eb569ec9274c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:31:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:40.889 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1b5b60bd-2531-4381-84cd-eb569ec9274c in datapath 1672dacf-b95d-4a80-9b7d-b30bde70ba8b bound to our chassis#033[00m
Oct  4 01:31:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:40.893 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1672dacf-b95d-4a80-9b7d-b30bde70ba8b#033[00m
Oct  4 01:31:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:40.895 103689 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpbag9ualq/privsep.sock']#033[00m
Oct  4 01:31:40 np0005470441 systemd-udevd[220322]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:31:40 np0005470441 NetworkManager[51690]: <info>  [1759555900.9184] device (tap1b5b60bd-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:31:40 np0005470441 NetworkManager[51690]: <info>  [1759555900.9191] device (tap1b5b60bd-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:31:40 np0005470441 systemd-machined[152624]: New machine qemu-1-instance-00000002.
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:40Z|00029|binding|INFO|Setting lport 1b5b60bd-2531-4381-84cd-eb569ec9274c ovn-installed in OVS
Oct  4 01:31:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:40Z|00030|binding|INFO|Setting lport 1b5b60bd-2531-4381-84cd-eb569ec9274c up in Southbound
Oct  4 01:31:40 np0005470441 nova_compute[192626]: 2025-10-04 05:31:40.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:40 np0005470441 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.419 2 DEBUG nova.network.neutron [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updated VIF entry in instance network info cache for port 1b5b60bd-2531-4381-84cd-eb569ec9274c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.421 2 DEBUG nova.network.neutron [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating instance_info_cache with network_info: [{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.448 2 DEBUG oslo_concurrency.lockutils [req-a85a7646-3103-4bd9-a622-4ef7ded66956 req-9ae043ab-bc3a-4510-b75a-05b670879735 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.583 103689 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.583 103689 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbag9ualq/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.472 220349 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.476 220349 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.478 220349 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.478 220349 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220349#033[00m
Oct  4 01:31:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:41.586 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[34bfa87a-6a7c-46de-abde-49c023cb7332]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.692 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555901.692605, a1baa49c-f428-4e4d-801c-abc2136158a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.694 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] VM Started (Lifecycle Event)#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.745 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.750 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555901.6933534, a1baa49c-f428-4e4d-801c-abc2136158a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.751 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.787 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.791 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:31:41 np0005470441 nova_compute[192626]: 2025-10-04 05:31:41.818 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:31:42 np0005470441 nova_compute[192626]: 2025-10-04 05:31:42.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.082 220349 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.082 220349 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.082 220349 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.654 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd20d05-b2b1-4037-a317-d8bd4c64c540]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.655 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1672dacf-b1 in ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.657 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1672dacf-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.657 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aabbd6ca-4eee-4274-a917-f15e87cc76f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.660 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[54979728-b992-44f3-b2bb-d828da2bfb77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.679 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4fb462-4e36-4cb7-8f49-5696b168c5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.693 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9c1725af-dfe0-44a8-91a9-b5a89e724a53]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:42.695 103689 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp3uudmbhj/privsep.sock']#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.394 103689 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.394 103689 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3uudmbhj/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.264 220363 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.267 220363 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.269 220363 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.269 220363 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220363#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.396 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[107b505f-abe5-400f-9ae7-58bf60f4fd24]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.899 220363 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.900 220363 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:43.900 220363 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:44 np0005470441 nova_compute[192626]: 2025-10-04 05:31:44.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.497 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9fab8554-b53c-4cbb-ae9e-6aa9caed7740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 NetworkManager[51690]: <info>  [1759555904.5024] manager: (tap1672dacf-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.501 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7535a31e-1a3a-4c4c-8476-4de829fe6617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 systemd-udevd[220373]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.531 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[51fd1fd2-5744-4375-9982-1d758519c236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.534 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[893f3f7f-ef74-4769-8c52-cb08fa83d236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 NetworkManager[51690]: <info>  [1759555904.5613] device (tap1672dacf-b0): carrier: link connected
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.563 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[46c993b2-ae3c-4cd3-8160-3260f5739afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.580 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b222af17-d696-43d5-99f7-083f9bc8d227]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1672dacf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:bc:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380221, 'reachable_time': 20101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220392, 'error': None, 'target': 'ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.593 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ddef1e81-8f8c-48a9-a1c6-2b5bb00c5efd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4c:bc90'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 380221, 'tstamp': 380221}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220393, 'error': None, 'target': 'ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.609 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[dde61386-be09-4d3a-af97-fd64641fd59c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1672dacf-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4c:bc:90'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380221, 'reachable_time': 20101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220394, 'error': None, 'target': 'ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.637 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[71f5f425-456e-45b8-a288-46ff2c4d2adc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.683 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ae80fd52-e732-4c40-b2ae-60d51386e86e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.684 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1672dacf-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.685 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.685 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1672dacf-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:44 np0005470441 nova_compute[192626]: 2025-10-04 05:31:44.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:44 np0005470441 NetworkManager[51690]: <info>  [1759555904.6879] manager: (tap1672dacf-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Oct  4 01:31:44 np0005470441 kernel: tap1672dacf-b0: entered promiscuous mode
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.690 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1672dacf-b0, col_values=(('external_ids', {'iface-id': '9b1c9b1f-2f28-469a-9424-d41454001476'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:44 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:44Z|00031|binding|INFO|Releasing lport 9b1c9b1f-2f28-469a-9424-d41454001476 from this chassis (sb_readonly=0)
Oct  4 01:31:44 np0005470441 nova_compute[192626]: 2025-10-04 05:31:44.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.693 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1672dacf-b95d-4a80-9b7d-b30bde70ba8b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1672dacf-b95d-4a80-9b7d-b30bde70ba8b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.693 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9afe7a77-2efe-418f-9b9b-b051508471e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.695 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-1672dacf-b95d-4a80-9b7d-b30bde70ba8b
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/1672dacf-b95d-4a80-9b7d-b30bde70ba8b.pid.haproxy
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 1672dacf-b95d-4a80-9b7d-b30bde70ba8b
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:31:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:44.696 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'env', 'PROCESS_TAG=haproxy-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1672dacf-b95d-4a80-9b7d-b30bde70ba8b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:31:44 np0005470441 nova_compute[192626]: 2025-10-04 05:31:44.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:45 np0005470441 podman[220426]: 2025-10-04 05:31:45.060099783 +0000 UTC m=+0.046938683 container create 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:31:45 np0005470441 systemd[1]: Started libpod-conmon-0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250.scope.
Oct  4 01:31:45 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:31:45 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f0ea3a409dc5053f3eddaf49f10ec20345692723951380c04c29e58c3961ce3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:31:45 np0005470441 podman[220426]: 2025-10-04 05:31:45.034372214 +0000 UTC m=+0.021211114 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:31:45 np0005470441 podman[220426]: 2025-10-04 05:31:45.1326789 +0000 UTC m=+0.119517820 container init 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:31:45 np0005470441 podman[220426]: 2025-10-04 05:31:45.138161927 +0000 UTC m=+0.125000827 container start 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:31:45 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [NOTICE]   (220445) : New worker (220447) forked
Oct  4 01:31:45 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [NOTICE]   (220445) : Loading success.
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.779 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "0ba50f2e-225f-45b8-9579-4a092ec91d7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.780 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "0ba50f2e-225f-45b8-9579-4a092ec91d7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.802 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.889 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.889 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.898 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:31:47 np0005470441 nova_compute[192626]: 2025-10-04 05:31:47.898 2 INFO nova.compute.claims [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.051 2 DEBUG nova.compute.provider_tree [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.095 2 ERROR nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [req-7f33f4ee-e503-447f-8841-87e26337d21f] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 4baba3a8-b392-49ca-9421-92d7b50a939b.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-7f33f4ee-e503-447f-8841-87e26337d21f"}]}#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.115 2 DEBUG nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.151 2 DEBUG nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.152 2 DEBUG nova.compute.provider_tree [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.170 2 DEBUG nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.206 2 DEBUG nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:31:48 np0005470441 podman[220456]: 2025-10-04 05:31:48.307332018 +0000 UTC m=+0.063987321 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, io.openshift.tags=minimal rhel9)
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.334 2 DEBUG nova.compute.provider_tree [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.384 2 DEBUG nova.scheduler.client.report [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updated inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.385 2 DEBUG nova.compute.provider_tree [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.385 2 DEBUG nova.compute.provider_tree [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.414 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.524s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.415 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.484 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.485 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.506 2 INFO nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.527 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.663 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.665 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.665 2 INFO nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Creating image(s)#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.665 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.666 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.666 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.677 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.767 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.768 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.768 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.779 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.842 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.843 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.878 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.880 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.881 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.948 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.949 2 DEBUG nova.virt.disk.api [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Checking if we can resize image /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:31:48 np0005470441 nova_compute[192626]: 2025-10-04 05:31:48.950 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.008 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.009 2 DEBUG nova.virt.disk.api [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Cannot resize image /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.010 2 DEBUG nova.objects.instance [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lazy-loading 'migration_context' on Instance uuid 0ba50f2e-225f-45b8-9579-4a092ec91d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.028 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.028 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Ensure instance console log exists: /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.029 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.029 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.029 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:49 np0005470441 nova_compute[192626]: 2025-10-04 05:31:49.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:50 np0005470441 nova_compute[192626]: 2025-10-04 05:31:50.058 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Successfully created port: b0db5f91-922c-4900-9345-3d621d2437de _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:31:50 np0005470441 podman[220493]: 2025-10-04 05:31:50.312569139 +0000 UTC m=+0.053391981 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.266 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Successfully updated port: b0db5f91-922c-4900-9345-3d621d2437de _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.296 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.297 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquired lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.297 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:31:52 np0005470441 nova_compute[192626]: 2025-10-04 05:31:52.649 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:31:53 np0005470441 nova_compute[192626]: 2025-10-04 05:31:53.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.023 2 DEBUG nova.compute.manager [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Received event network-changed-b0db5f91-922c-4900-9345-3d621d2437de external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.023 2 DEBUG nova.compute.manager [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Refreshing instance network info cache due to event network-changed-b0db5f91-922c-4900-9345-3d621d2437de. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.024 2 DEBUG oslo_concurrency.lockutils [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:54 np0005470441 podman[220518]: 2025-10-04 05:31:54.29226081 +0000 UTC m=+0.044193654 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.713 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.886 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.887 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.887 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:54 np0005470441 nova_compute[192626]: 2025-10-04 05:31:54.887 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.077 2 DEBUG nova.network.neutron [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Updating instance_info_cache with network_info: [{"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.110 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.200 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.201 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.267 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.409 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.410 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5742MB free_disk=73.46671295166016GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.411 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.411 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.432 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Releasing lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.433 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Instance network_info: |[{"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.433 2 DEBUG oslo_concurrency.lockutils [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.433 2 DEBUG nova.network.neutron [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Refreshing network info cache for port b0db5f91-922c-4900-9345-3d621d2437de _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.437 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Start _get_guest_xml network_info=[{"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.440 2 WARNING nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.443 2 DEBUG nova.virt.libvirt.host [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.444 2 DEBUG nova.virt.libvirt.host [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.446 2 DEBUG nova.virt.libvirt.host [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.447 2 DEBUG nova.virt.libvirt.host [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.448 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.448 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.448 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.449 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.449 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.449 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.449 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.450 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.450 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.450 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.450 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.450 2 DEBUG nova.virt.hardware [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.453 2 DEBUG nova.virt.libvirt.vif [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:31:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-145592455',display_name='tempest-TestServerMultinode-server-145592455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-145592455',id=5,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4e8bd7ec9b14151a805796b3de01401',ramdisk_id='',reservation_id='r-pb0gbtyc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1750367848',owner_user_name='tempest-TestServerMultinode-1750367848
-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:31:48Z,user_data=None,user_id='162446113f4b4dd5aed5e211cf8cdc28',uuid=0ba50f2e-225f-45b8-9579-4a092ec91d7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.454 2 DEBUG nova.network.os_vif_util [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Converting VIF {"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.454 2 DEBUG nova.network.os_vif_util [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:90:f1,bridge_name='br-int',has_traffic_filtering=True,id=b0db5f91-922c-4900-9345-3d621d2437de,network=Network(5e08e8a1-cc23-408f-a3cc-36e42d124bec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db5f91-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.455 2 DEBUG nova.objects.instance [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0ba50f2e-225f-45b8-9579-4a092ec91d7d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.562 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <uuid>0ba50f2e-225f-45b8-9579-4a092ec91d7d</uuid>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <name>instance-00000005</name>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestServerMultinode-server-145592455</nova:name>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:31:55</nova:creationTime>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:user uuid="162446113f4b4dd5aed5e211cf8cdc28">tempest-TestServerMultinode-1750367848-project-admin</nova:user>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:project uuid="b4e8bd7ec9b14151a805796b3de01401">tempest-TestServerMultinode-1750367848</nova:project>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        <nova:port uuid="b0db5f91-922c-4900-9345-3d621d2437de">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="serial">0ba50f2e-225f-45b8-9579-4a092ec91d7d</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="uuid">0ba50f2e-225f-45b8-9579-4a092ec91d7d</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.config"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:45:90:f1"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <target dev="tapb0db5f91-92"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/console.log" append="off"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:31:55 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:31:55 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:31:55 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:31:55 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.564 2 DEBUG nova.compute.manager [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Preparing to wait for external event network-vif-plugged-b0db5f91-922c-4900-9345-3d621d2437de prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.564 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Acquiring lock "0ba50f2e-225f-45b8-9579-4a092ec91d7d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.565 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "0ba50f2e-225f-45b8-9579-4a092ec91d7d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.565 2 DEBUG oslo_concurrency.lockutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Lock "0ba50f2e-225f-45b8-9579-4a092ec91d7d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.566 2 DEBUG nova.virt.libvirt.vif [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:31:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-145592455',display_name='tempest-TestServerMultinode-server-145592455',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-145592455',id=5,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b4e8bd7ec9b14151a805796b3de01401',ramdisk_id='',reservation_id='r-pb0gbtyc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1750367848',owner_user_name='tempest-TestServerMultinode-
1750367848-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:31:48Z,user_data=None,user_id='162446113f4b4dd5aed5e211cf8cdc28',uuid=0ba50f2e-225f-45b8-9579-4a092ec91d7d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.566 2 DEBUG nova.network.os_vif_util [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Converting VIF {"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.567 2 DEBUG nova.network.os_vif_util [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:90:f1,bridge_name='br-int',has_traffic_filtering=True,id=b0db5f91-922c-4900-9345-3d621d2437de,network=Network(5e08e8a1-cc23-408f-a3cc-36e42d124bec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db5f91-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.567 2 DEBUG os_vif [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:90:f1,bridge_name='br-int',has_traffic_filtering=True,id=b0db5f91-922c-4900-9345-3d621d2437de,network=Network(5e08e8a1-cc23-408f-a3cc-36e42d124bec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db5f91-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.572 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.575 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb0db5f91-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.576 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb0db5f91-92, col_values=(('external_ids', {'iface-id': 'b0db5f91-922c-4900-9345-3d621d2437de', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:90:f1', 'vm-uuid': '0ba50f2e-225f-45b8-9579-4a092ec91d7d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:55 np0005470441 NetworkManager[51690]: <info>  [1759555915.5781] manager: (tapb0db5f91-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.584 2 INFO os_vif [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:90:f1,bridge_name='br-int',has_traffic_filtering=True,id=b0db5f91-922c-4900-9345-3d621d2437de,network=Network(5e08e8a1-cc23-408f-a3cc-36e42d124bec),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0db5f91-92')#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.600 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance a1baa49c-f428-4e4d-801c-abc2136158a1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.601 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 0ba50f2e-225f-45b8-9579-4a092ec91d7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.601 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.601 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.680 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.778 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.786 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.786 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.787 2 DEBUG nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] No VIF found with MAC fa:16:3e:45:90:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.787 2 INFO nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Using config drive#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.882 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:31:55 np0005470441 nova_compute[192626]: 2025-10-04 05:31:55.882 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.472s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:31:56 np0005470441 podman[220546]: 2025-10-04 05:31:56.332628673 +0000 UTC m=+0.089504753 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true)
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.883 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.883 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.883 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.949 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.949 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.949 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.950 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.950 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:56 np0005470441 nova_compute[192626]: 2025-10-04 05:31:56.950 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.312 2 INFO nova.virt.libvirt.driver [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Creating config drive at /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.config#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.321 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx6rxknjp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.446 2 DEBUG oslo_concurrency.processutils [None req-edf27560-3f68-456e-b58c-91a170961755 162446113f4b4dd5aed5e211cf8cdc28 b4e8bd7ec9b14151a805796b3de01401 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx6rxknjp" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.4980] manager: (tapb0db5f91-92): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Oct  4 01:31:57 np0005470441 kernel: tapb0db5f91-92: entered promiscuous mode
Oct  4 01:31:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:57Z|00032|binding|INFO|Claiming lport b0db5f91-922c-4900-9345-3d621d2437de for this chassis.
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:57Z|00033|binding|INFO|b0db5f91-922c-4900-9345-3d621d2437de: Claiming fa:16:3e:45:90:f1 10.100.0.6
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.517 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:90:f1 10.100.0.6'], port_security=['fa:16:3e:45:90:f1 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e9e9f975-90df-42ae-87df-7451e439c0f8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a5793b85-7bb0-474d-9705-5081c2ba80bd, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=b0db5f91-922c-4900-9345-3d621d2437de) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.519 103689 INFO neutron.agent.ovn.metadata.agent [-] Port b0db5f91-922c-4900-9345-3d621d2437de in datapath 5e08e8a1-cc23-408f-a3cc-36e42d124bec bound to our chassis#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.522 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5e08e8a1-cc23-408f-a3cc-36e42d124bec#033[00m
Oct  4 01:31:57 np0005470441 systemd-udevd[220590]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.533 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1d361f36-110d-4b41-a1a9-9ce4d31acf33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.534 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5e08e8a1-c1 in ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.536 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5e08e8a1-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.536 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba64b1c-872a-4108-83c3-a052f83a337e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.537 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9f80dc1d-fed8-4a3a-a27e-916e2f50f747]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.5440] device (tapb0db5f91-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.5452] device (tapb0db5f91-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:31:57 np0005470441 systemd-machined[152624]: New machine qemu-2-instance-00000005.
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.560 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[10595d57-9be0-4536-a8e8-5b959c40ad30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:57Z|00034|binding|INFO|Setting lport b0db5f91-922c-4900-9345-3d621d2437de ovn-installed in OVS
Oct  4 01:31:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:57Z|00035|binding|INFO|Setting lport b0db5f91-922c-4900-9345-3d621d2437de up in Southbound
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.574 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[41562751-70d7-49ac-bc59-8ca5aef97288]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.605 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[6c8c9d79-ff6d-4c50-aa57-83eb53847fca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.609 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c42098-f34d-416d-ac17-163c4e2ae357]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.6104] manager: (tap5e08e8a1-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.635 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[fe5274e6-a67e-45af-b6f3-7db86ec9ce6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.637 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[6beef5b7-2bae-4f82-a6c7-f07f7ae7b95b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.6604] device (tap5e08e8a1-c0): carrier: link connected
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.665 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[511eef4d-77ef-461b-afde-4f0849391f37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.678 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0527dc2b-9f41-4655-bdae-953272ec3a6b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e08e8a1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:ca:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381531, 'reachable_time': 40666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220624, 'error': None, 'target': 'ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.690 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f767e6ba-8b1d-41be-a332-802d9221c8c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe78:ca23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 381531, 'tstamp': 381531}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220625, 'error': None, 'target': 'ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.703 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[06a1a009-9b10-4005-8864-1be9b3ba3f5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5e08e8a1-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:ca:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 381531, 'reachable_time': 40666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220627, 'error': None, 'target': 'ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.732 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[be5e8558-4f44-4176-8b88-f87b71419c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.791 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[079df81f-1c07-49aa-8b70-37f6665c41f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.793 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e08e8a1-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.793 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.794 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e08e8a1-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 kernel: tap5e08e8a1-c0: entered promiscuous mode
Oct  4 01:31:57 np0005470441 NetworkManager[51690]: <info>  [1759555917.7971] manager: (tap5e08e8a1-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.800 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5e08e8a1-c0, col_values=(('external_ids', {'iface-id': 'db9028df-71fe-41c7-b2ab-4ee0cd55cd60'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:31:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:31:57Z|00036|binding|INFO|Releasing lport db9028df-71fe-41c7-b2ab-4ee0cd55cd60 from this chassis (sb_readonly=0)
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 nova_compute[192626]: 2025-10-04 05:31:57.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.820 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5e08e8a1-cc23-408f-a3cc-36e42d124bec.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5e08e8a1-cc23-408f-a3cc-36e42d124bec.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.820 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff42d94-48a5-4cb1-801a-61b8419900e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.821 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-5e08e8a1-cc23-408f-a3cc-36e42d124bec
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/5e08e8a1-cc23-408f-a3cc-36e42d124bec.pid.haproxy
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 5e08e8a1-cc23-408f-a3cc-36e42d124bec
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:31:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:31:57.822 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'env', 'PROCESS_TAG=haproxy-5e08e8a1-cc23-408f-a3cc-36e42d124bec', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5e08e8a1-cc23-408f-a3cc-36e42d124bec.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:31:58 np0005470441 podman[220665]: 2025-10-04 05:31:58.161665351 +0000 UTC m=+0.052376958 container create 8d4ccbaa728904138293b5d8a9f4c74d733e3c80464e8ae81222b67aff1d0ec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:31:58 np0005470441 systemd[1]: Started libpod-conmon-8d4ccbaa728904138293b5d8a9f4c74d733e3c80464e8ae81222b67aff1d0ec1.scope.
Oct  4 01:31:58 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:31:58 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a16ea81e22205806cb3724208b0dab8559569b2b61513fb6601e733e0e1c750c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.224 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555918.2241375, 0ba50f2e-225f-45b8-9579-4a092ec91d7d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.225 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] VM Started (Lifecycle Event)#033[00m
Oct  4 01:31:58 np0005470441 podman[220665]: 2025-10-04 05:31:58.133881926 +0000 UTC m=+0.024593553 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:31:58 np0005470441 podman[220665]: 2025-10-04 05:31:58.234080702 +0000 UTC m=+0.124792339 container init 8d4ccbaa728904138293b5d8a9f4c74d733e3c80464e8ae81222b67aff1d0ec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:31:58 np0005470441 podman[220665]: 2025-10-04 05:31:58.239570329 +0000 UTC m=+0.130281926 container start 8d4ccbaa728904138293b5d8a9f4c74d733e3c80464e8ae81222b67aff1d0ec1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:31:58 np0005470441 neutron-haproxy-ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec[220680]: [NOTICE]   (220684) : New worker (220686) forked
Oct  4 01:31:58 np0005470441 neutron-haproxy-ovnmeta-5e08e8a1-cc23-408f-a3cc-36e42d124bec[220680]: [NOTICE]   (220684) : Loading success.
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.258 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.262 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555918.2242198, 0ba50f2e-225f-45b8-9579-4a092ec91d7d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.262 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.531 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.537 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:31:58 np0005470441 nova_compute[192626]: 2025-10-04 05:31:58.714 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:32:00 np0005470441 nova_compute[192626]: 2025-10-04 05:32:00.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:01 np0005470441 nova_compute[192626]: 2025-10-04 05:32:01.468 2 DEBUG nova.network.neutron [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Updated VIF entry in instance network info cache for port b0db5f91-922c-4900-9345-3d621d2437de. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:32:01 np0005470441 nova_compute[192626]: 2025-10-04 05:32:01.469 2 DEBUG nova.network.neutron [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Updating instance_info_cache with network_info: [{"id": "b0db5f91-922c-4900-9345-3d621d2437de", "address": "fa:16:3e:45:90:f1", "network": {"id": "5e08e8a1-cc23-408f-a3cc-36e42d124bec", "bridge": "br-int", "label": "tempest-TestServerMultinode-528545247-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "96a09323e9a546459e4909abafb753ad", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0db5f91-92", "ovs_interfaceid": "b0db5f91-922c-4900-9345-3d621d2437de", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:32:01 np0005470441 nova_compute[192626]: 2025-10-04 05:32:01.498 2 DEBUG oslo_concurrency.lockutils [req-ef3d3092-f214-4a4e-912d-088369060520 req-29f27869-ec90-4b3a-8813-f1d9d1eb6e41 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-0ba50f2e-225f-45b8-9579-4a092ec91d7d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:32:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:01.499 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:01 np0005470441 nova_compute[192626]: 2025-10-04 05:32:01.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:01.500 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:32:02 np0005470441 nova_compute[192626]: 2025-10-04 05:32:02.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:02 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:02.503 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:03 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:03.055 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}de8229276c4809ff85fcb8c6af4b9585394568f21277b2bce7cdd4db44db18a4" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.578 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.579 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.599 2 DEBUG nova.compute.manager [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:32:04 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:04.687 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Sat, 04 Oct 2025 05:32:03 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-89f6d25a-e3b3-4a02-adbc-20c55fd0a3c2 x-openstack-request-id: req-89f6d25a-e3b3-4a02-adbc-20c55fd0a3c2 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  4 01:32:04 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:04.687 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "557d8681-fd0f-46b9-8cde-c2bcdc3f068c", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/557d8681-fd0f-46b9-8cde-c2bcdc3f068c"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/557d8681-fd0f-46b9-8cde-c2bcdc3f068c"}]}, {"id": "9585bc8c-c7a8-4928-b67c-bb6035012f8e", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  4 01:32:04 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:04.687 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-89f6d25a-e3b3-4a02-adbc-20c55fd0a3c2 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  4 01:32:04 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:04.689 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}de8229276c4809ff85fcb8c6af4b9585394568f21277b2bce7cdd4db44db18a4" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.719 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.719 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.726 2 DEBUG nova.virt.hardware [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:32:04 np0005470441 nova_compute[192626]: 2025-10-04 05:32:04.727 2 INFO nova.compute.claims [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:32:05 np0005470441 podman[220696]: 2025-10-04 05:32:05.303854603 +0000 UTC m=+0.055964738 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:32:05 np0005470441 podman[220695]: 2025-10-04 05:32:05.309240415 +0000 UTC m=+0.062078294 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.363 2 DEBUG nova.compute.provider_tree [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.412 2 DEBUG nova.scheduler.client.report [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.450 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.451 2 DEBUG nova.compute.manager [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.526 2 DEBUG nova.compute.manager [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.527 2 DEBUG nova.network.neutron [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.554 2 INFO nova.virt.libvirt.driver [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.581 2 DEBUG nova.compute.manager [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.736 2 DEBUG nova.compute.manager [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.738 2 DEBUG nova.virt.libvirt.driver [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.739 2 INFO nova.virt.libvirt.driver [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Creating image(s)
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.739 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "/var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.740 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "/var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.741 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "/var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.760 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.815 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.816 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.817 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.831 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.886 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.887 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.922 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.922 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:05 np0005470441 nova_compute[192626]: 2025-10-04 05:32:05.923 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.010 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.011 2 DEBUG nova.virt.disk.api [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Checking if we can resize image /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.011 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.086 2 DEBUG oslo_concurrency.processutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.087 2 DEBUG nova.virt.disk.api [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Cannot resize image /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.088 2 DEBUG nova.objects.instance [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lazy-loading 'migration_context' on Instance uuid b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.102 2 DEBUG nova.virt.libvirt.driver [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.103 2 DEBUG nova.virt.libvirt.driver [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Ensure instance console log exists: /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.103 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.104 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.104 2 DEBUG oslo_concurrency.lockutils [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:06 np0005470441 nova_compute[192626]: 2025-10-04 05:32:06.267 2 DEBUG nova.policy [None req-50c2c894-3830-4dad-89ed-92b218603a89 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4d92f636431b4823991511cd92b89378', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd524d98814ee446a9dd1280968a7744c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.480 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Sat, 04 Oct 2025 05:32:04 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-561ceb7e-d011-42c8-afb9-e95b56716fa5 x-openstack-request-id: req-561ceb7e-d011-42c8-afb9-e95b56716fa5 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.481 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "9585bc8c-c7a8-4928-b67c-bb6035012f8e", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.481 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/9585bc8c-c7a8-4928-b67c-bb6035012f8e used request id req-561ceb7e-d011-42c8-afb9-e95b56716fa5 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.481 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'd0c087ea0f62444e80490916b42c760f', 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'hostId': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.483 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'name': 'tempest-TestServerMultinode-server-145592455', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000005', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': 'b4e8bd7ec9b14151a805796b3de01401', 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'hostId': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.484 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.487 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for a1baa49c-f428-4e4d-801c-abc2136158a1 / tap1b5b60bd-25 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.487 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.489 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0ba50f2e-225f-45b8-9579-4a092ec91d7d / tapb0db5f91-92 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.489 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c566b92-1686-4675-a955-7aaad16da4d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.484203', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '771cbeb4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': '3475e644813759769699c03fd859d231e539d7f17691a3b8d340cd6721604f86'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.484203', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '771d14b8-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': 'b22872015e89145aa9f9263ece3e7d904feb231b632306d3be2ed3361ff34ba2'}]}, 'timestamp': '2025-10-04 05:32:06.490163', '_unique_id': 'c493fb4d12a848ee8fe7549959ca78a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.494 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.497 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.510 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.510 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.520 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.520 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5d776bb-425a-4c38-8f4c-adab33d5faec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.497248', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '772038b4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': '635309ba2fca22bfbe30e7488bab4c688f9eb7395ff1f0ca770ffc7f45957eed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 
'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.497248', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77204836-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': 'e45438b0a36ab11e07a41473ee0fddfcd239aaa62b8cb4e1e1198c30ea550f9f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.497248', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7721b3b0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': '78f898059363fb5e10f6f395cc4c799eff48ebad4fe0d2b44bb16a3987d9ae2a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.497248', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7721c792-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': '8bd360659f85f613b37bab895ef66a2f5a5433de851ad932f89b4c6cb6ae5108'}]}, 'timestamp': '2025-10-04 05:32:06.520997', '_unique_id': '34eff1c7995d4209b596b9fe4a790a2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.522 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.523 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.539 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.539 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance a1baa49c-f428-4e4d-801c-abc2136158a1: ceilometer.compute.pollsters.NoVolumeException
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.555 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.556 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 0ba50f2e-225f-45b8-9579-4a092ec91d7d: ceilometer.compute.pollsters.NoVolumeException
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.556 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.556 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.556 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.557 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.557 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b22e2692-c37d-4fef-86c0-e2b5692bd0ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.556474', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77273fe2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': '00718edb488ab96b0dc20801b6c0d241506b18cdfa73410efcdcec205eaefac6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 
'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.556474', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77274a6e-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': 'fe61c72798ebe8d4b90e5370682303ffbdb3a204270f3ebdc59bdd5b9f39152b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.556474', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7727525c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': 'cc7b3c59ff5cc81d2afd003218c7e07b6c24a6409ee1782d7a1af26bdcddac25'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.556474', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77275c5c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': 'e1411cb73c55b52f3e0913f57a7f2a71d9edf15997af929a04d1bf3b4da29150'}]}, 'timestamp': '2025-10-04 05:32:06.557537', '_unique_id': '0b0e117a08614f648ead376fa22ae583'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.558 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.559 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.577 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.577 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.599 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.599 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30707e6f-307c-43a2-981a-7f00f552d803', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.559180', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '772a7734-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '192e57abaf550e2683790529956476a3ea901f25584d4314fcf2e821ae021c59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 
'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.559180', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '772a81d4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '6d6c70305ecbf73d56341d67a7f99cdf6fb75e423fcda39d98b8ec0b0ca601fb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.559180', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 
'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '772dc7e0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '9bd508a1165d861df68452e60b7423b827ce0cc2d228d6231e200fed76763a0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.559180', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '772dd3d4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '814d96619e32d185a279b4ce205fbcc2acc8a7df8526467d43b3a84100213bee'}]}, 'timestamp': '2025-10-04 05:32:06.599898', '_unique_id': '448d5e21786847c6b1f4f98625d39828'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.600 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.601 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.601 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.601 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>]
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.602 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.602 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.602 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74e44937-5167-4b97-b9c1-3ff636c0e4f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'timestamp': '2025-10-04T05:32:06.602171', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '772e3766-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.253981094, 'message_signature': '6e4776bccf1a0376a843d28a3a2e1ac5deb20dad562dcd1a61ce9845f37c52f0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'timestamp': '2025-10-04T05:32:06.602171', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '772e410c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.270377511, 'message_signature': '486e13565afb74f601d809da4e0c1f979e52837593425d45ebab6220068d2da0'}]}, 'timestamp': '2025-10-04 05:32:06.602679', '_unique_id': '1d933fc842a14e92950f204c2aabb940'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.603 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcff4266-7aaa-4a23-91f6-cc8848403030', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.603833', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '772e78ca-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'ff3f5f0f16a751158191b3af789c40025726846616a23669cf77c36245dcc769'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.603833', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '772e8388-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '02d884172f4254f77ceaec61eda9240512d1f5c577145ce59e9c963e6a5015ea'}]}, 'timestamp': '2025-10-04 05:32:06.604386', '_unique_id': 'ec1a93b3afe34d63934b62aacd631666'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.604 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.605 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.605 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>]
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.605 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.605 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5faa156-1fb2-40b9-bec2-0ef478a68940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.605895', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '772ec8c0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'fb796c27b5e0ee364542af216c16566fc5b81404c52710512290d2ffcb67dae5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.605895', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '772ed1e4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '7655b5ee0dd2700b6d4a9ee3b86fba5a1e41f035de41b12e000bc441cca44da1'}]}, 'timestamp': '2025-10-04 05:32:06.606400', '_unique_id': '871f460bfa394b30be380f82ee05a1b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.606 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.607 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.607 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.607 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42f3380f-9282-497f-b188-7624434ae832', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.607538', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '772f0902-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'd409f81c0f64d32a6639efdb7561690e79f3e4ae4a9a15c72364bbf89ba9f796'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.607538', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '772f13b6-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': 'c50280eb70bb9cc1ff8153c62935a243e4d12fcbc55afb2cec36fe5b8cf5408e'}]}, 'timestamp': '2025-10-04 05:32:06.608076', '_unique_id': 'd80d4adf18c1455f92447d4492e78365'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.608 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.609 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.609 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.609 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3e2d26a-3453-46f9-9971-b60490443e79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.609179', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '772f48cc-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'c228341b684e312941bc2e467ad1d9bb26ed970ad268775da5b4d7202b7b2a49'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.609179', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '772f5466-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '44cdb5076d253b78c5cc059cf81370c71894d1e2eb7f37a9080058ca9fd8a36a'}]}, 'timestamp': '2025-10-04 05:32:06.609733', '_unique_id': '439e7b585ed0408e92dd6b88a877a930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.610 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6e73017-ccf1-4574-a3d9-56a085503838', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.610831', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '772f8968-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'ec69b335d79b3472a06e42ee4b2cca8abd2471d5b26f00eb5e97dc0580aa0a63'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.610831', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '772f941c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '08a7ba20831f602b03030b930b5aed14e880d8f5574275dd6d201340771797ea'}]}, 'timestamp': '2025-10-04 05:32:06.611363', '_unique_id': '9c61c64313cf4e96b8074623be350124'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.611 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.612 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.612 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.612 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76784fd3-b88e-49b1-a1af-5c50d5fd07d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.612458', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '772fca22-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '91cc40e03519f6413baa29ad89f611ceefc4d514fa8352ae72cc0330fc2541a6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': 
None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.612458', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '772fd49a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '28167da92441fea6a4ef18b3f082f8c070a83d8efeb7b2a0deda93b5e2ca024d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.612458', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '772fdd14-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': 'c349cd017addfd4882a255d3c75968efb3b27af6322d9657ae6277060d51f2bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.612458', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '772fe570-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '4ca0c5cf6a4801927cc8d765657f113b0ab4e9fa0838c97dfbcf8c03ea51e266'}]}, 'timestamp': '2025-10-04 05:32:06.613445', '_unique_id': '5b5236c3ad584888bfdff09aee172cd7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.613 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.614 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.614 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.614 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d091645-0d22-4b43-aea4-81bda8322590', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.614607', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77301cf2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': 'f4cb5ac3bbb39bbf15acecd17db1c2c6507a2b3ec99e9cc2ddc1e89eecadcf77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 
'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.614607', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773024f4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': 'a05e8bf656abf7809501922d601f9cf5bfcd3b81dd58db4f7459cdfee46dfe38'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.614607', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77302c42-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '4f97caa468d30ba4ad85c9e70bd443b04051fa1caa8a979236c0819cf211675c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.614607', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77303354-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '971f365de2b5ce6f0120a6dccbf4f5ea6d8c6a953f025ca2aca30dcdd474d221'}]}, 'timestamp': '2025-10-04 05:32:06.615421', '_unique_id': 'f6b966b0816549dba636abd2a6a568ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.615 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.616 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.616 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.616 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd19197a6-bef4-4379-a2e5-0e9584e3fd79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.616607', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77306982-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '4def9e53493a250d3a311583788aede94703b511b6985fcbba5941a51804bd2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 
'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.616607', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773071d4-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': 'e5771bbe514962d95cfa2a9d277cab7290d4e92534300b08634945d5926b2a56'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.616607', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77307ab2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': 'b832c97bb47ad5929f005570764eba10dfcf0bf0acda3927190aeade2ed58539'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.616607', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7730839a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '2a62a8f076714dfef7a2aef23e2ffa24859d1be3ce7fbfbacafffce59f32754f'}]}, 'timestamp': '2025-10-04 05:32:06.617499', '_unique_id': '08f64b1a939540cf9625b023fddd9875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.617 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.618 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.618 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.618 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>]
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.618 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '41efccbe-8ba6-4b02-b437-11eea45c9b87', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.619000', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '7730c8a0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': 'd3cfda2ce4a1b62868105f0b85756655a457ff2b9600e0003105e4621de18165'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.619000', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '7730d228-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': 'ea17deb100ee26f9b8bbe55fd371b7b989c254e21373e119b71ba0b992443b0d'}]}, 'timestamp': '2025-10-04 05:32:06.619526', '_unique_id': 'ab0d5b7fb1aa474e98a010651f91f58d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.619 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.620 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.620 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.620 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27d1dc9f-3223-4195-893e-ab979ce00b28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.620629', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '7731086a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': '384e30f2ebbe20dc1b97f304fd37b861da6efdcbcc7ca6a675a8a11d9c5d5172'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 
'162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.620629', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '773112e2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '67799c1fbfa7a4c784f4a983cdbc9e428c4daddb4f7d714c5d72edea0edc1672'}]}, 'timestamp': '2025-10-04 05:32:06.621162', '_unique_id': '21773925a61f4f2db1c6f57cfc5adbdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.621 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.622 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.622 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.622 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.622 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b36dc7e-5645-4e25-835a-72438b5fed8b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.622395', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77314d98-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '44be9c33c7f2f96db5c0e6bd2821f8ec9ef7e8176992aead993a17d031ccc225'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 
'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.622395', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '773157fc-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.27380031, 'message_signature': '72f734e690e49d063b171096ac4dfd89d9298cf4cf7fcfdce0b24d8f96cdc047'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.622395', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7731610c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '6382d846da781580e0e3d54239d5a2c2d62e81e5f79b18cbeb84380b7b7dd700'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.622395', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77316972-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.292656786, 'message_signature': '75047404a55aa1b2c3c432aee5ef44b47ef4d3870398173f874a8f8cd6a6dc38'}]}, 'timestamp': '2025-10-04 05:32:06.623379', '_unique_id': '469963190a394a00978894cc54354eba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.623 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.624 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.624 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.624 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-675739622>, <NovaLikeServer: tempest-TestServerMultinode-server-145592455>]
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.624 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.624 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47878f46-08df-490d-b015-12fd2b243f31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.624905', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '7731af40-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': '978ccfc6a9dc21d3d3f7322416d3d8aab36f1af961d4831625b8e67664f5b6c1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.624905', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '7731b828-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': '4f0fc6ed850709a42f3334db419f60016f79bc51d4c242de287200cf74febe18'}]}, 'timestamp': '2025-10-04 05:32:06.625402', '_unique_id': '1ce966edd25142d38e0828093a1dcefd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.625 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.626 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.626 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.626 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e737164-db2f-4c8f-aa5c-b0110cadf9a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-00000002-a1baa49c-f428-4e4d-801c-abc2136158a1-tap1b5b60bd-25', 'timestamp': '2025-10-04T05:32:06.626496', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'tap1b5b60bd-25', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d1:d1:e0', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1b5b60bd-25'}, 'message_id': '7731ee2e-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.19882787, 'message_signature': '4f52a65db0c277ef307e02534305f3d63f19ca63ffdc26fd187bf76f94b7d7f9'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': 'instance-00000005-0ba50f2e-225f-45b8-9579-4a092ec91d7d-tapb0db5f91-92', 'timestamp': '2025-10-04T05:32:06.626496', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'tapb0db5f91-92', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:45:90:f1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0db5f91-92'}, 'message_id': '7731f8b0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.202523273, 'message_signature': 'f8f977254c010cbdddc47401c773be3f963f0397bd3a15f922f380c382cdff64'}]}, 'timestamp': '2025-10-04 05:32:06.627046', '_unique_id': '723d206d3959427f935562476f62f310'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.627 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.628 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.628 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.628 12 DEBUG ceilometer.compute.pollsters [-] a1baa49c-f428-4e4d-801c-abc2136158a1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.629 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.629 12 DEBUG ceilometer.compute.pollsters [-] 0ba50f2e-225f-45b8-9579-4a092ec91d7d/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa8b2c9-ec10-4072-a7d3-a27f4ca76a82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1-vda', 'timestamp': '2025-10-04T05:32:06.628298', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '7732344c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': 'a686b56101031a6048c58ea8efdd6bce92280aa8a8f39bb39183a2e6a4b47612'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 
'a1baa49c-f428-4e4d-801c-abc2136158a1-sda', 'timestamp': '2025-10-04T05:32:06.628298', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-675739622', 'name': 'instance-00000002', 'instance_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '77324784-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.211881612, 'message_signature': 'd94ab6e35fa2798af805db7a94e8101bb0e67fea222e89367dc2adc3b4102797'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-vda', 'timestamp': '2025-10-04T05:32:06.628298', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '77325008-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': '981e791ba72fa0a71b9e32d40d3422bf5a118f031cb2c83ca6fd22c0f07b826b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '162446113f4b4dd5aed5e211cf8cdc28', 'user_name': None, 'project_id': 'b4e8bd7ec9b14151a805796b3de01401', 'project_name': None, 'resource_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d-sda', 'timestamp': '2025-10-04T05:32:06.628298', 'resource_metadata': {'display_name': 'tempest-TestServerMultinode-server-145592455', 'name': 'instance-00000005', 'instance_id': '0ba50f2e-225f-45b8-9579-4a092ec91d7d', 'instance_type': 'm1.nano', 'host': 'a59e9f5bf196637724d4fe0aedd911fa135861c90dead496a6d282a1', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '7732585a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3824.225709225, 'message_signature': 'f6f267e44ae3a61024cae8d2b36aabfd67cce1553c120d09ceb8a1d560a35a83'}]}, 'timestamp': '2025-10-04 05:32:06.629496', '_unique_id': '9fac71ab26194fb688e4ed12101ed9bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:32:06 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:32:06.630 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:32:46 np0005470441 nova_compute[192626]: 2025-10-04 05:32:46.956 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759555951.9541328, 0ba50f2e-225f-45b8-9579-4a092ec91d7d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:32:46 np0005470441 nova_compute[192626]: 2025-10-04 05:32:46.957 2 INFO nova.compute.manager [-] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:32:46 np0005470441 nova_compute[192626]: 2025-10-04 05:32:46.979 2 DEBUG nova.compute.manager [None req-beeb695d-2d07-4dc9-9947-3d769051405f - - - - - -] [instance: 0ba50f2e-225f-45b8-9579-4a092ec91d7d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:47 np0005470441 rsyslogd[1005]: imjournal: 686 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.704 2 DEBUG nova.network.neutron [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating instance_info_cache with network_info: [{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.722 2 DEBUG oslo_concurrency.lockutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Releasing lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.814 2 DEBUG nova.virt.libvirt.driver [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.815 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Creating file /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/1280183f99ac4ec0b6e015f99d23761b.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  4 01:32:47 np0005470441 nova_compute[192626]: 2025-10-04 05:32:47.815 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/1280183f99ac4ec0b6e015f99d23761b.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:47.885 103796 DEBUG eventlet.wsgi.server [-] (103796) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:47.886 103796 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: Accept: */*#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: Connection: close#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: Content-Type: text/plain#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: Host: 169.254.169.254#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: User-Agent: curl/7.84.0#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: X-Forwarded-For: 10.100.0.14#015
Oct  4 01:32:47 np0005470441 ovn_metadata_agent[103684]: X-Ovn-Network-Id: 16b6e76b-2352-428b-8cf4-911c7127c998 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.252 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/1280183f99ac4ec0b6e015f99d23761b.tmp" returned: 1 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.253 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/1280183f99ac4ec0b6e015f99d23761b.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.253 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Creating directory /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.253 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.454 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:48 np0005470441 nova_compute[192626]: 2025-10-04 05:32:48.459 2 DEBUG nova.virt.libvirt.driver [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  4 01:32:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:48Z|00053|binding|INFO|Claiming lport b0eb2882-c375-490a-9308-11da20a838e8 for this chassis.
Oct  4 01:32:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:48Z|00054|binding|INFO|b0eb2882-c375-490a-9308-11da20a838e8: Claiming fa:16:3e:b3:d8:fc 10.100.0.7
Oct  4 01:32:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:48Z|00055|binding|INFO|Setting lport b0eb2882-c375-490a-9308-11da20a838e8 up in Southbound
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.907 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:d8:fc 10.100.0.7'], port_security=['fa:16:3e:b3:d8:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '11', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=b0eb2882-c375-490a-9308-11da20a838e8) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.909 103689 INFO neutron.agent.ovn.metadata.agent [-] Port b0eb2882-c375-490a-9308-11da20a838e8 in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 bound to our chassis#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.912 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a984030f-c569-4bd0-83e0-9a6812d06f48#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.923 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[47162fd2-1f89-4050-94c5-ebec319a6a36]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.925 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa984030f-c1 in ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.926 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa984030f-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.926 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[94f6a7d1-2006-43c2-a0ca-7406d28b1172]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.927 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[065b661b-ccad-430c-893f-0b78e20d5327]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.940 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[aab8fd60-32e7-4d5a-9848-d0026e2d2e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.969 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4058fa63-5339-41e7-b1b9-9e2f8947bfdd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:48.999 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9e283494-2d19-4f0d-9369-41430004969b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 NetworkManager[51690]: <info>  [1759555969.0060] manager: (tapa984030f-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.005 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6abd49ea-62d5-4909-9173-628361dee532]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 systemd-udevd[221309]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.037 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[6234955e-e3d8-4913-a883-4a8f3e18e247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.043 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1427c1e5-c3e2-44cd-93f3-af7838884402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 NetworkManager[51690]: <info>  [1759555969.0691] device (tapa984030f-c0): carrier: link connected
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.074 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae531b3-e7f1-4060-8eb9-1c27c6313a00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.090 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[085ffa4e-0a89-4775-a498-5e003da3b1eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221328, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.104 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc8d604-3748-4f5d-86d2-f5808c81074d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:4491'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386672, 'tstamp': 386672}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221329, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.120 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[464fd62e-0a4e-4b4a-80cd-0f914b81d8d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221330, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.147 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cc378b69-4eb7-4276-a7f2-cd1765b9e078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.202 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[31bb93b9-60af-43b8-8688-db55adbc0818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.203 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.203 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.204 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa984030f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:49 np0005470441 NetworkManager[51690]: <info>  [1759555969.2063] manager: (tapa984030f-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct  4 01:32:49 np0005470441 kernel: tapa984030f-c0: entered promiscuous mode
Oct  4 01:32:49 np0005470441 nova_compute[192626]: 2025-10-04 05:32:49.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.208 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa984030f-c0, col_values=(('external_ids', {'iface-id': '3ea6f406-5ff7-4b46-9301-f23ee9be4b86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:49 np0005470441 nova_compute[192626]: 2025-10-04 05:32:49.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:49 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:49Z|00056|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:32:49 np0005470441 nova_compute[192626]: 2025-10-04 05:32:49.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.223 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a984030f-c569-4bd0-83e0-9a6812d06f48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a984030f-c569-4bd0-83e0-9a6812d06f48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.224 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4af2fb13-8160-4111-b789-84af68038dca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.224 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-a984030f-c569-4bd0-83e0-9a6812d06f48
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/a984030f-c569-4bd0-83e0-9a6812d06f48.pid.haproxy
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID a984030f-c569-4bd0-83e0-9a6812d06f48
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:32:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:49.225 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'env', 'PROCESS_TAG=haproxy-a984030f-c569-4bd0-83e0-9a6812d06f48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a984030f-c569-4bd0-83e0-9a6812d06f48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:32:49 np0005470441 nova_compute[192626]: 2025-10-04 05:32:49.568 2 INFO nova.compute.manager [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Post operation of migration started#033[00m
Oct  4 01:32:49 np0005470441 nova_compute[192626]: 2025-10-04 05:32:49.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:49 np0005470441 podman[221361]: 2025-10-04 05:32:49.592400863 +0000 UTC m=+0.068330604 container create 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:32:49 np0005470441 systemd[1]: Started libpod-conmon-4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4.scope.
Oct  4 01:32:49 np0005470441 podman[221361]: 2025-10-04 05:32:49.549551643 +0000 UTC m=+0.025481404 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:32:49 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:32:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/95d891736d1b9070e752cbe59dd0bc48203c1aea23c924d741850adfa30c7d15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:32:49 np0005470441 podman[221361]: 2025-10-04 05:32:49.672736634 +0000 UTC m=+0.148666405 container init 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  4 01:32:49 np0005470441 podman[221361]: 2025-10-04 05:32:49.679753011 +0000 UTC m=+0.155682752 container start 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:32:49 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [NOTICE]   (221400) : New worker (221402) forked
Oct  4 01:32:49 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [NOTICE]   (221400) : Loading success.
Oct  4 01:32:49 np0005470441 podman[221374]: 2025-10-04 05:32:49.713021951 +0000 UTC m=+0.083495007 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container)
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.160 103796 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.160 103796 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 2.2743859#033[00m
Oct  4 01:32:50 np0005470441 haproxy-metadata-proxy-16b6e76b-2352-428b-8cf4-911c7127c998[220913]: 10.100.0.14:35214 [04/Oct/2025:05:32:47.883] listener listener/metadata 0/0/0/2276/2276 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.242 103796 DEBUG eventlet.wsgi.server [-] (103796) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.243 103796 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: Accept: */*#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: Connection: close#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: Content-Length: 100#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: Content-Type: application/x-www-form-urlencoded#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: Host: 169.254.169.254#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: User-Agent: curl/7.84.0#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: X-Forwarded-For: 10.100.0.14#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: X-Ovn-Network-Id: 16b6e76b-2352-428b-8cf4-911c7127c998#015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: #015
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.337 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.338 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquired lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.338 2 DEBUG nova.network.neutron [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.522 103796 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.523 103796 INFO eventlet.wsgi.server [-] 10.100.0.14,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.2803411#033[00m
Oct  4 01:32:50 np0005470441 haproxy-metadata-proxy-16b6e76b-2352-428b-8cf4-911c7127c998[220913]: 10.100.0.14:35228 [04/Oct/2025:05:32:50.241] listener listener/metadata 0/0/0/282/282 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Oct  4 01:32:50 np0005470441 kernel: tap1b5b60bd-25 (unregistering): left promiscuous mode
Oct  4 01:32:50 np0005470441 NetworkManager[51690]: <info>  [1759555970.5989] device (tap1b5b60bd-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:32:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:50Z|00057|binding|INFO|Releasing lport 1b5b60bd-2531-4381-84cd-eb569ec9274c from this chassis (sb_readonly=0)
Oct  4 01:32:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:50Z|00058|binding|INFO|Setting lport 1b5b60bd-2531-4381-84cd-eb569ec9274c down in Southbound
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:50Z|00059|binding|INFO|Removing iface tap1b5b60bd-25 ovn-installed in OVS
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.622 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:d1:e0 10.100.0.8'], port_security=['fa:16:3e:d1:d1:e0 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a1baa49c-f428-4e4d-801c-abc2136158a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ce69795d-c8bb-4412-99cd-26423ff2a719', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93ea15ef-b67e-4d79-b2d8-fce7c9643ed0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1b5b60bd-2531-4381-84cd-eb569ec9274c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.624 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1b5b60bd-2531-4381-84cd-eb569ec9274c in datapath 1672dacf-b95d-4a80-9b7d-b30bde70ba8b unbound from our chassis#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.626 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1672dacf-b95d-4a80-9b7d-b30bde70ba8b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.627 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[412b9ff5-0220-474e-992b-e83846e4b15e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.627 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b namespace which is not needed anymore#033[00m
Oct  4 01:32:50 np0005470441 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct  4 01:32:50 np0005470441 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 13.627s CPU time.
Oct  4 01:32:50 np0005470441 systemd-machined[152624]: Machine qemu-1-instance-00000002 terminated.
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [NOTICE]   (220445) : haproxy version is 2.8.14-c23fe91
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [NOTICE]   (220445) : path to executable is /usr/sbin/haproxy
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [WARNING]  (220445) : Exiting Master process...
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [WARNING]  (220445) : Exiting Master process...
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [ALERT]    (220445) : Current worker (220447) exited with code 143 (Terminated)
Oct  4 01:32:50 np0005470441 neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b[220441]: [WARNING]  (220445) : All workers exited. Exiting... (0)
Oct  4 01:32:50 np0005470441 systemd[1]: libpod-0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250.scope: Deactivated successfully.
Oct  4 01:32:50 np0005470441 podman[221432]: 2025-10-04 05:32:50.756776048 +0000 UTC m=+0.046021878 container died 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:32:50 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250-userdata-shm.mount: Deactivated successfully.
Oct  4 01:32:50 np0005470441 systemd[1]: var-lib-containers-storage-overlay-4f0ea3a409dc5053f3eddaf49f10ec20345692723951380c04c29e58c3961ce3-merged.mount: Deactivated successfully.
Oct  4 01:32:50 np0005470441 podman[221432]: 2025-10-04 05:32:50.796675494 +0000 UTC m=+0.085921324 container cleanup 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:32:50 np0005470441 systemd[1]: libpod-conmon-0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250.scope: Deactivated successfully.
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 podman[221464]: 2025-10-04 05:32:50.861897885 +0000 UTC m=+0.044746841 container remove 0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.872 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7abf6115-a0b5-44a1-9502-aa3ef093238a]: (4, ('Sat Oct  4 05:32:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b (0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250)\n0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250\nSat Oct  4 05:32:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b (0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250)\n0e6160dc507bfbf94d47d10ab3a6511c2f221a7fbb9881b6af4a2b5ef80af250\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.874 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9437184c-1833-4b67-9d9c-9a557b8edc0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.876 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1672dacf-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 kernel: tap1672dacf-b0: left promiscuous mode
Oct  4 01:32:50 np0005470441 nova_compute[192626]: 2025-10-04 05:32:50.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.900 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[de4f541d-ba24-4ecd-97f9-61b75d5f3140]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.933 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[50f50a8d-d669-4cc6-8104-e0bd456ced99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.934 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4360f0e5-c541-43fc-8042-63e3812eafef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.949 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[68b35d22-46d5-4bf0-8264-6eaa0a797569]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380214, 'reachable_time': 25068, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221496, 'error': None, 'target': 'ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.952 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1672dacf-b95d-4a80-9b7d-b30bde70ba8b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:32:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:50.952 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[877771d4-5607-4ee0-80a2-43558580a772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:50 np0005470441 systemd[1]: run-netns-ovnmeta\x2d1672dacf\x2db95d\x2d4a80\x2d9b7d\x2db30bde70ba8b.mount: Deactivated successfully.
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.474 2 INFO nova.virt.libvirt.driver [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Instance shutdown successfully after 3 seconds.#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.479 2 INFO nova.virt.libvirt.driver [-] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Instance destroyed successfully.#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.481 2 DEBUG nova.virt.libvirt.vif [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-675739622',display_name='tempest-TestNetworkAdvancedServerOps-server-675739622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-675739622',id=2,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP5+CRSM3/W/3cdhXTvJjWK5UqBbkM4aujgf+ON1jBkOSGjcEuxVD5W29TRWk+OUAt6wyZdunDlHRBm9PNDcqsoaG2HxeOcc3JYO5cd3/bCy4UrUPgcVg69owsQ+Gy2tQQ==',key_name='tempest-TestNetworkAdvancedServerOps-2087633968',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:32:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-vf0w3tlq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:32:44Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=a1baa49c-f428-4e4d-801c-abc2136158a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1651115047", "vif_mac": "fa:16:3e:d1:d1:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.481 2 DEBUG nova.network.os_vif_util [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Converting VIF {"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1651115047", "vif_mac": "fa:16:3e:d1:d1:e0"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.482 2 DEBUG nova.network.os_vif_util [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.483 2 DEBUG os_vif [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.486 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b5b60bd-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.491 2 INFO os_vif [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25')#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.494 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.551 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.552 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.606 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.607 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Copying file /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk to 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  4 01:32:51 np0005470441 nova_compute[192626]: 2025-10-04 05:32:51.608 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.213 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "scp -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.214 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Copying file /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.214 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:52 np0005470441 podman[221507]: 2025-10-04 05:32:52.302404348 +0000 UTC m=+0.051621610 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.435 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.436 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.437 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.437 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.438 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.440 2 INFO nova.compute.manager [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Terminating instance#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.442 2 DEBUG nova.compute.manager [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.446 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "scp -C -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.config" returned: 0 in 0.232s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.450 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Copying file /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.450 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:52 np0005470441 kernel: tap61f3e22e-81 (unregistering): left promiscuous mode
Oct  4 01:32:52 np0005470441 NetworkManager[51690]: <info>  [1759555972.4650] device (tap61f3e22e-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00060|binding|INFO|Releasing lport 61f3e22e-81bb-4257-ae8b-6f87296172ed from this chassis (sb_readonly=0)
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00061|binding|INFO|Setting lport 61f3e22e-81bb-4257-ae8b-6f87296172ed down in Southbound
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00062|binding|INFO|Removing iface tap61f3e22e-81 ovn-installed in OVS
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.490 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:0b:80 10.100.0.14'], port_security=['fa:16:3e:cf:0b:80 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b6e76b-2352-428b-8cf4-911c7127c998', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd524d98814ee446a9dd1280968a7744c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd96f2665-9d66-43ac-b460-7337f161a2d7 faf91cdd-7d56-42d1-a136-2ee8cb42f5c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19d78b1d-6df5-4206-af1e-f4a6f06bbd45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=61f3e22e-81bb-4257-ae8b-6f87296172ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.492 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 61f3e22e-81bb-4257-ae8b-6f87296172ed in datapath 16b6e76b-2352-428b-8cf4-911c7127c998 unbound from our chassis#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.494 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16b6e76b-2352-428b-8cf4-911c7127c998, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.495 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4d1851-9a13-4637-ad2d-2239366d0970]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.496 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998 namespace which is not needed anymore#033[00m
Oct  4 01:32:52 np0005470441 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct  4 01:32:52 np0005470441 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 13.499s CPU time.
Oct  4 01:32:52 np0005470441 systemd-machined[152624]: Machine qemu-3-instance-00000008 terminated.
Oct  4 01:32:52 np0005470441 neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998[220907]: [NOTICE]   (220911) : haproxy version is 2.8.14-c23fe91
Oct  4 01:32:52 np0005470441 neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998[220907]: [NOTICE]   (220911) : path to executable is /usr/sbin/haproxy
Oct  4 01:32:52 np0005470441 neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998[220907]: [WARNING]  (220911) : Exiting Master process...
Oct  4 01:32:52 np0005470441 neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998[220907]: [ALERT]    (220911) : Current worker (220913) exited with code 143 (Terminated)
Oct  4 01:32:52 np0005470441 neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998[220907]: [WARNING]  (220911) : All workers exited. Exiting... (0)
Oct  4 01:32:52 np0005470441 systemd[1]: libpod-21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068.scope: Deactivated successfully.
Oct  4 01:32:52 np0005470441 podman[221556]: 2025-10-04 05:32:52.609803302 +0000 UTC m=+0.040101258 container died 21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:32:52 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068-userdata-shm.mount: Deactivated successfully.
Oct  4 01:32:52 np0005470441 systemd[1]: var-lib-containers-storage-overlay-abb80cd23872e79f4a1983fe08af32941d91a1499ec4905d45d41028d5092bd2-merged.mount: Deactivated successfully.
Oct  4 01:32:52 np0005470441 podman[221556]: 2025-10-04 05:32:52.639124295 +0000 UTC m=+0.069422211 container cleanup 21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS)
Oct  4 01:32:52 np0005470441 systemd[1]: libpod-conmon-21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068.scope: Deactivated successfully.
Oct  4 01:32:52 np0005470441 kernel: tap61f3e22e-81: entered promiscuous mode
Oct  4 01:32:52 np0005470441 systemd-udevd[221537]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:32:52 np0005470441 kernel: tap61f3e22e-81 (unregistering): left promiscuous mode
Oct  4 01:32:52 np0005470441 NetworkManager[51690]: <info>  [1759555972.6617] manager: (tap61f3e22e-81): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00063|binding|INFO|Claiming lport 61f3e22e-81bb-4257-ae8b-6f87296172ed for this chassis.
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00064|binding|INFO|61f3e22e-81bb-4257-ae8b-6f87296172ed: Claiming fa:16:3e:cf:0b:80 10.100.0.14
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.670 2 DEBUG oslo_concurrency.processutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] CMD "scp -C -r /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk.info" returned: 0 in 0.220s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.678 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:0b:80 10.100.0.14'], port_security=['fa:16:3e:cf:0b:80 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b6e76b-2352-428b-8cf4-911c7127c998', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd524d98814ee446a9dd1280968a7744c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd96f2665-9d66-43ac-b460-7337f161a2d7 faf91cdd-7d56-42d1-a136-2ee8cb42f5c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19d78b1d-6df5-4206-af1e-f4a6f06bbd45, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=61f3e22e-81bb-4257-ae8b-6f87296172ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00065|binding|INFO|Setting lport 61f3e22e-81bb-4257-ae8b-6f87296172ed ovn-installed in OVS
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00066|binding|INFO|Setting lport 61f3e22e-81bb-4257-ae8b-6f87296172ed up in Southbound
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00067|binding|INFO|Releasing lport 61f3e22e-81bb-4257-ae8b-6f87296172ed from this chassis (sb_readonly=1)
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00068|if_status|INFO|Not setting lport 61f3e22e-81bb-4257-ae8b-6f87296172ed down as sb is readonly
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00069|binding|INFO|Removing iface tap61f3e22e-81 ovn-installed in OVS
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00070|binding|INFO|Releasing lport 61f3e22e-81bb-4257-ae8b-6f87296172ed from this chassis (sb_readonly=0)
Oct  4 01:32:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:52Z|00071|binding|INFO|Setting lport 61f3e22e-81bb-4257-ae8b-6f87296172ed down in Southbound
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.702 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cf:0b:80 10.100.0.14'], port_security=['fa:16:3e:cf:0b:80 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16b6e76b-2352-428b-8cf4-911c7127c998', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd524d98814ee446a9dd1280968a7744c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd96f2665-9d66-43ac-b460-7337f161a2d7 faf91cdd-7d56-42d1-a136-2ee8cb42f5c9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.198'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19d78b1d-6df5-4206-af1e-f4a6f06bbd45, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=61f3e22e-81bb-4257-ae8b-6f87296172ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.710 2 INFO nova.virt.libvirt.driver [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Instance destroyed successfully.#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.711 2 DEBUG nova.objects.instance [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lazy-loading 'resources' on Instance uuid b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:32:52 np0005470441 podman[221585]: 2025-10-04 05:32:52.714050303 +0000 UTC m=+0.056326357 container remove 21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.718 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e6a0cf-1b20-418e-96fc-6187d7750296]: (4, ('Sat Oct  4 05:32:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998 (21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068)\n21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068\nSat Oct  4 05:32:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998 (21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068)\n21a3a4c8abe2767ada1ef5861012502d413c610a700d9aafb2fbd36604d95068\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.719 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[21dc4c8c-5659-427c-9aba-e2d94870f8f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.719 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16b6e76b-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 kernel: tap16b6e76b-20: left promiscuous mode
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.725 2 DEBUG nova.virt.libvirt.vif [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:32:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-759946528',display_name='tempest-TestServerBasicOps-server-759946528',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-759946528',id=8,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG6N18PtyTRG8d3415Yf08SZBuCIUBQuWZUJqsc0CfgVeo/0LoqRHSFJK3y2JKc2B1GlJM7kmFIMdzEzdkobadJ+5tGhYioVo9/T2BzEFXJa5LafXWjqXgIvEylvAyQxGw==',key_name='tempest-TestServerBasicOps-570152522',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:32:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d524d98814ee446a9dd1280968a7744c',ramdisk_id='',reservation_id='r-ii0kmtc3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-3429004',owner_user_name='tempest-TestServerBasicOps-3429004-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:32:50Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='4d92f636431b4823991511cd92b89378',uuid=b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "61f3e22e-81bb-4257-ae8b-6f87296172ed", "address": "fa:16:3e:cf:0b:80", 
"network": {"id": "16b6e76b-2352-428b-8cf4-911c7127c998", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1533470400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d524d98814ee446a9dd1280968a7744c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f3e22e-81", "ovs_interfaceid": "61f3e22e-81bb-4257-ae8b-6f87296172ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.725 2 DEBUG nova.network.os_vif_util [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Converting VIF {"id": "61f3e22e-81bb-4257-ae8b-6f87296172ed", "address": "fa:16:3e:cf:0b:80", "network": {"id": "16b6e76b-2352-428b-8cf4-911c7127c998", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1533470400-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d524d98814ee446a9dd1280968a7744c", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap61f3e22e-81", "ovs_interfaceid": "61f3e22e-81bb-4257-ae8b-6f87296172ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.726 2 DEBUG nova.network.os_vif_util [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cf:0b:80,bridge_name='br-int',has_traffic_filtering=True,id=61f3e22e-81bb-4257-ae8b-6f87296172ed,network=Network(16b6e76b-2352-428b-8cf4-911c7127c998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61f3e22e-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.726 2 DEBUG os_vif [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:0b:80,bridge_name='br-int',has_traffic_filtering=True,id=61f3e22e-81bb-4257-ae8b-6f87296172ed,network=Network(16b6e76b-2352-428b-8cf4-911c7127c998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61f3e22e-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61f3e22e-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.738 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[916bbeff-02cd-419a-80cd-1f40f23bc9e4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.739 2 INFO os_vif [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cf:0b:80,bridge_name='br-int',has_traffic_filtering=True,id=61f3e22e-81bb-4257-ae8b-6f87296172ed,network=Network(16b6e76b-2352-428b-8cf4-911c7127c998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap61f3e22e-81')#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.739 2 INFO nova.virt.libvirt.driver [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Deleting instance files /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951_del#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.740 2 INFO nova.virt.libvirt.driver [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Deletion of /var/lib/nova/instances/b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951_del complete#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.761 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4659b620-6766-478d-9601-7d7382718347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.762 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e60af20a-737b-4630-b1df-312cb626f383]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.774 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[22fe4f81-01a6-4706-9b3c-12affe1f55e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 383432, 'reachable_time': 34012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221614, 'error': None, 'target': 'ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.776 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16b6e76b-2352-428b-8cf4-911c7127c998 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.776 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[4aacd6ff-c34a-4fe1-996b-bd096ffd84b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.776 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 61f3e22e-81bb-4257-ae8b-6f87296172ed in datapath 16b6e76b-2352-428b-8cf4-911c7127c998 unbound from our chassis#033[00m
Oct  4 01:32:52 np0005470441 systemd[1]: run-netns-ovnmeta\x2d16b6e76b\x2d2352\x2d428b\x2d8cf4\x2d911c7127c998.mount: Deactivated successfully.
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.778 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16b6e76b-2352-428b-8cf4-911c7127c998, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.779 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[49712a63-7f11-458f-9b40-333d42622223]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.779 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 61f3e22e-81bb-4257-ae8b-6f87296172ed in datapath 16b6e76b-2352-428b-8cf4-911c7127c998 unbound from our chassis#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.781 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16b6e76b-2352-428b-8cf4-911c7127c998, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:32:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:32:52.781 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[387b446e-3f7c-4d6a-acf0-60379f04ac80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.793 2 INFO nova.compute.manager [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.794 2 DEBUG oslo.service.loopingcall [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.794 2 DEBUG nova.compute.manager [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:32:52 np0005470441 nova_compute[192626]: 2025-10-04 05:32:52.794 2 DEBUG nova.network.neutron [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:32:53 np0005470441 nova_compute[192626]: 2025-10-04 05:32:53.418 2 DEBUG neutronclient.v2_0.client [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1b5b60bd-2531-4381-84cd-eb569ec9274c for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  4 01:32:53 np0005470441 nova_compute[192626]: 2025-10-04 05:32:53.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:53 np0005470441 nova_compute[192626]: 2025-10-04 05:32:53.760 2 DEBUG oslo_concurrency.lockutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:53 np0005470441 nova_compute[192626]: 2025-10-04 05:32:53.761 2 DEBUG oslo_concurrency.lockutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:53 np0005470441 nova_compute[192626]: 2025-10-04 05:32:53.761 2 DEBUG oslo_concurrency.lockutils [None req-286aec57-e642-4ee5-a347-8d703fb27cbb 34888105fea04271b8af8dd42dc90bed 647a5559d16149d4b3bd32af048c16ba - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.718 2 DEBUG nova.network.neutron [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Updating instance_info_cache with network_info: [{"id": "b0eb2882-c375-490a-9308-11da20a838e8", "address": "fa:16:3e:b3:d8:fc", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0eb2882-c3", "ovs_interfaceid": "b0eb2882-c375-490a-9308-11da20a838e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.719 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.719 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.741 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Releasing lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.757 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.757 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.758 2 DEBUG oslo_concurrency.lockutils [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.762 2 INFO nova.virt.libvirt.driver [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  4 01:32:54 np0005470441 virtqemud[192168]: Domain id=4 name='instance-00000003' uuid=b89756b5-b481-4ad9-aaf8-afda62b5d1bc is tainted: custom-monitor
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.804 2 DEBUG nova.compute.manager [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Received event network-vif-unplugged-61f3e22e-81bb-4257-ae8b-6f87296172ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.805 2 DEBUG oslo_concurrency.lockutils [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.805 2 DEBUG oslo_concurrency.lockutils [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.805 2 DEBUG oslo_concurrency.lockutils [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.805 2 DEBUG nova.compute.manager [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] No waiting events found dispatching network-vif-unplugged-61f3e22e-81bb-4257-ae8b-6f87296172ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.806 2 DEBUG nova.compute.manager [req-0f62db9e-aaf8-482c-9a11-250f7c3505d6 req-5d3a5abb-a654-44da-b31c-e37133aa771e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Received event network-vif-unplugged-61f3e22e-81bb-4257-ae8b-6f87296172ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.889 2 DEBUG nova.compute.manager [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-vif-unplugged-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.890 2 DEBUG oslo_concurrency.lockutils [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.890 2 DEBUG oslo_concurrency.lockutils [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.890 2 DEBUG oslo_concurrency.lockutils [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.890 2 DEBUG nova.compute.manager [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] No waiting events found dispatching network-vif-unplugged-1b5b60bd-2531-4381-84cd-eb569ec9274c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:32:54 np0005470441 nova_compute[192626]: 2025-10-04 05:32:54.891 2 WARNING nova.compute.manager [req-a67dbc46-39ea-4697-994c-440e6288163a req-0c7aa320-371f-4ca1-ad22-cad090217949 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received unexpected event network-vif-unplugged-1b5b60bd-2531-4381-84cd-eb569ec9274c for instance with vm_state active and task_state resize_migrated.#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.225 2 DEBUG nova.network.neutron [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.256 2 INFO nova.compute.manager [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Took 2.46 seconds to deallocate network for instance.#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.307 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.308 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.428 2 DEBUG nova.compute.provider_tree [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.446 2 DEBUG nova.scheduler.client.report [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.472 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.500 2 INFO nova.scheduler.client.report [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Deleted allocations for instance b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.607 2 DEBUG oslo_concurrency.lockutils [None req-e9f8bad8-92f6-4950-a3a1-d7813ccb1f6d 4d92f636431b4823991511cd92b89378 d524d98814ee446a9dd1280968a7744c - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.714 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.715 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.715 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.732 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.758 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.758 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.758 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.758 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.769 2 INFO nova.virt.libvirt.driver [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.831 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000002, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/a1baa49c-f428-4e4d-801c-abc2136158a1/disk#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.834 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:55 np0005470441 podman[221616]: 2025-10-04 05:32:55.884861152 +0000 UTC m=+0.077925466 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.897 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.898 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:32:55 np0005470441 nova_compute[192626]: 2025-10-04 05:32:55.956 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.095 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.097 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5546MB free_disk=73.40969467163086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.097 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.097 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.151 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Migration for instance a1baa49c-f428-4e4d-801c-abc2136158a1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.152 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Migration for instance b89756b5-b481-4ad9-aaf8-afda62b5d1bc refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.213 2 INFO nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating resource usage from migration 99e4cd77-1340-4221-aff0-a61c9a0f4eaa#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.213 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Starting to track outgoing migration 99e4cd77-1340-4221-aff0-a61c9a0f4eaa with flavor 9585bc8c-c7a8-4928-b67c-bb6035012f8e _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.214 2 INFO nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Updating resource usage from migration 14b66b21-4fb8-450f-a823-1a3907d39051#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.214 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Starting to track incoming migration 14b66b21-4fb8-450f-a823-1a3907d39051 with flavor 9585bc8c-c7a8-4928-b67c-bb6035012f8e _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.284 2 WARNING nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance b89756b5-b481-4ad9-aaf8-afda62b5d1bc has been moved to another host compute-2.ctlplane.example.com(compute-2.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}.#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.285 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Migration 99e4cd77-1340-4221-aff0-a61c9a0f4eaa is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.285 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.285 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.348 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.366 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.394 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.395 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:32:56 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:56Z|00072|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.774 2 INFO nova.virt.libvirt.driver [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.780 2 DEBUG nova.compute.manager [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.804 2 DEBUG nova.objects.instance [None req-56a6fca9-f9ce-4dd8-9e86-795f66e761f6 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.969 2 DEBUG nova.compute.manager [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Received event network-vif-plugged-61f3e22e-81bb-4257-ae8b-6f87296172ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.969 2 DEBUG oslo_concurrency.lockutils [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.970 2 DEBUG oslo_concurrency.lockutils [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.970 2 DEBUG oslo_concurrency.lockutils [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.970 2 DEBUG nova.compute.manager [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] No waiting events found dispatching network-vif-plugged-61f3e22e-81bb-4257-ae8b-6f87296172ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.970 2 WARNING nova.compute.manager [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Received unexpected event network-vif-plugged-61f3e22e-81bb-4257-ae8b-6f87296172ed for instance with vm_state deleted and task_state None.
Oct  4 01:32:56 np0005470441 nova_compute[192626]: 2025-10-04 05:32:56.971 2 DEBUG nova.compute.manager [req-53b4736e-8e05-46cf-8ab4-94d18f394735 req-ffa55c4e-20f6-4ca5-a846-c50cac64c8dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Received event network-vif-deleted-61f3e22e-81bb-4257-ae8b-6f87296172ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:32:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:32:57Z|00073|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.378 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:32:57 np0005470441 podman[221643]: 2025-10-04 05:32:57.816668324 +0000 UTC m=+0.078425152 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.824 2 DEBUG nova.compute.manager [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.825 2 DEBUG oslo_concurrency.lockutils [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.825 2 DEBUG oslo_concurrency.lockutils [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.825 2 DEBUG oslo_concurrency.lockutils [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.825 2 DEBUG nova.compute.manager [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] No waiting events found dispatching network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:32:57 np0005470441 nova_compute[192626]: 2025-10-04 05:32:57.825 2 WARNING nova.compute.manager [req-63026021-b41a-40cb-b588-f672d75e74bf req-7a5334a4-5a90-40d5-b0c9-7010c4d4792e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received unexpected event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c for instance with vm_state active and task_state resize_migrated.
Oct  4 01:32:59 np0005470441 nova_compute[192626]: 2025-10-04 05:32:59.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:33:00 np0005470441 nova_compute[192626]: 2025-10-04 05:33:00.012 2 DEBUG nova.compute.manager [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-changed-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:33:00 np0005470441 nova_compute[192626]: 2025-10-04 05:33:00.012 2 DEBUG nova.compute.manager [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Refreshing instance network info cache due to event network-changed-1b5b60bd-2531-4381-84cd-eb569ec9274c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:33:00 np0005470441 nova_compute[192626]: 2025-10-04 05:33:00.013 2 DEBUG oslo_concurrency.lockutils [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:33:00 np0005470441 nova_compute[192626]: 2025-10-04 05:33:00.013 2 DEBUG oslo_concurrency.lockutils [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:33:00 np0005470441 nova_compute[192626]: 2025-10-04 05:33:00.013 2 DEBUG nova.network.neutron [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Refreshing network info cache for port 1b5b60bd-2531-4381-84cd-eb569ec9274c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  4 01:33:02 np0005470441 nova_compute[192626]: 2025-10-04 05:33:02.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:02 np0005470441 nova_compute[192626]: 2025-10-04 05:33:02.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:03 np0005470441 nova_compute[192626]: 2025-10-04 05:33:03.527 2 DEBUG nova.network.neutron [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updated VIF entry in instance network info cache for port 1b5b60bd-2531-4381-84cd-eb569ec9274c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  4 01:33:03 np0005470441 nova_compute[192626]: 2025-10-04 05:33:03.527 2 DEBUG nova.network.neutron [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating instance_info_cache with network_info: [{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:33:03 np0005470441 nova_compute[192626]: 2025-10-04 05:33:03.574 2 DEBUG oslo_concurrency.lockutils [req-8fab22df-1f13-4924-81cd-9d4920062de6 req-f143f458-8929-4173-8829-75d120846da3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:33:05 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:05.620 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:33:05 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:05.621 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.868 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759555970.8677256, a1baa49c-f428-4e4d-801c-abc2136158a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.869 2 INFO nova.compute.manager [-] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] VM Stopped (Lifecycle Event)
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.892 2 DEBUG nova.compute.manager [None req-d70003b9-0c75-443d-b410-506d459eb819 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.896 2 DEBUG nova.compute.manager [None req-d70003b9-0c75-443d-b410-506d459eb819 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.915 2 INFO nova.compute.manager [None req-d70003b9-0c75-443d-b410-506d459eb819 - - - - - -] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.976 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.976 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:05 np0005470441 nova_compute[192626]: 2025-10-04 05:33:05.976 2 DEBUG nova.compute.manager [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Oct  4 01:33:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:06.737 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:06.738 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:06.739 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:07 np0005470441 podman[221673]: 2025-10-04 05:33:07.307447261 +0000 UTC m=+0.049671638 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:33:07 np0005470441 podman[221672]: 2025-10-04 05:33:07.307481813 +0000 UTC m=+0.051873054 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.437 2 DEBUG nova.compute.manager [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.438 2 DEBUG oslo_concurrency.lockutils [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.438 2 DEBUG oslo_concurrency.lockutils [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.438 2 DEBUG oslo_concurrency.lockutils [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.438 2 DEBUG nova.compute.manager [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] No waiting events found dispatching network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.439 2 WARNING nova.compute.manager [req-fdc66bc6-9826-4a53-a18d-a2f983a0c941 req-f10ee001-58cc-4a8c-9b65-89bfd401f91b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received unexpected event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c for instance with vm_state resized and task_state None.
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.737 2 DEBUG neutronclient.v2_0.client [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 1b5b60bd-2531-4381-84cd-eb569ec9274c for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.738 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.738 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquired lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.738 2 DEBUG nova.network.neutron [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.739 2 DEBUG nova.objects.instance [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'info_cache' on Instance uuid a1baa49c-f428-4e4d-801c-abc2136158a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.740 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759555972.7092505, b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.740 2 INFO nova.compute.manager [-] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] VM Stopped (Lifecycle Event)
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.819 2 DEBUG nova.compute.manager [None req-254f433d-38e0-45de-b81e-d1c146b8c613 - - - - - -] [instance: b7beeabd-6a5f-40d8-8dfe-b9ccd1d74951] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.959 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.959 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:07 np0005470441 nova_compute[192626]: 2025-10-04 05:33:07.977 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.062 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.062 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.069 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.069 2 INFO nova.compute.claims [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Claim successful on node compute-1.ctlplane.example.com
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.260 2 DEBUG nova.compute.provider_tree [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.282 2 DEBUG nova.scheduler.client.report [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.312 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.249s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.313 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.371 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.371 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.424 2 INFO nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.464 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.688 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.689 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.689 2 INFO nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Creating image(s)#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.690 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.690 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.691 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.702 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.754 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.755 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.756 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.766 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.859 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.860 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.888 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.889 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.889 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.944 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.945 2 DEBUG nova.virt.disk.api [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Checking if we can resize image /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.945 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.962 2 DEBUG nova.policy [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d287f06015a40ac9c85190888828f26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b6962138a4941b79cb3bac6166c1b8a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:33:08 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.999 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:08.999 2 DEBUG nova.virt.disk.api [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Cannot resize image /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.000 2 DEBUG nova.objects.instance [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lazy-loading 'migration_context' on Instance uuid e2fc47c6-9030-42b5-9a97-5c3c992f04a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.021 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.021 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Ensure instance console log exists: /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.022 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.022 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:09 np0005470441 nova_compute[192626]: 2025-10-04 05:33:09.022 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.460 2 DEBUG nova.network.neutron [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Updating instance_info_cache with network_info: [{"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.503 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Releasing lock "refresh_cache-a1baa49c-f428-4e4d-801c-abc2136158a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.504 2 DEBUG nova.objects.instance [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'migration_context' on Instance uuid a1baa49c-f428-4e4d-801c-abc2136158a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.542 2 DEBUG nova.virt.libvirt.vif [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:31:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-675739622',display_name='tempest-TestNetworkAdvancedServerOps-server-675739622',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-675739622',id=2,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP5+CRSM3/W/3cdhXTvJjWK5UqBbkM4aujgf+ON1jBkOSGjcEuxVD5W29TRWk+OUAt6wyZdunDlHRBm9PNDcqsoaG2HxeOcc3JYO5cd3/bCy4UrUPgcVg69owsQ+Gy2tQQ==',key_name='tempest-TestNetworkAdvancedServerOps-2087633968',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:33:03Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-vf0w3tlq',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:33:03Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=a1baa49c-f428-4e4d-801c-abc2136158a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.543 2 DEBUG nova.network.os_vif_util [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "address": "fa:16:3e:d1:d1:e0", "network": {"id": "1672dacf-b95d-4a80-9b7d-b30bde70ba8b", "bridge": "br-int", "label": "tempest-network-smoke--1651115047", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b5b60bd-25", "ovs_interfaceid": "1b5b60bd-2531-4381-84cd-eb569ec9274c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.544 2 DEBUG nova.network.os_vif_util [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.544 2 DEBUG os_vif [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b5b60bd-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.547 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.552 2 INFO os_vif [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:d1:e0,bridge_name='br-int',has_traffic_filtering=True,id=1b5b60bd-2531-4381-84cd-eb569ec9274c,network=Network(1672dacf-b95d-4a80-9b7d-b30bde70ba8b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b5b60bd-25')#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.552 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.553 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.652 2 DEBUG nova.compute.manager [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.653 2 DEBUG oslo_concurrency.lockutils [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.653 2 DEBUG oslo_concurrency.lockutils [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.653 2 DEBUG oslo_concurrency.lockutils [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.653 2 DEBUG nova.compute.manager [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] No waiting events found dispatching network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.653 2 WARNING nova.compute.manager [req-69cbd956-8767-42b0-a6b4-f115f96d4013 req-3d50cdd4-632f-4d8c-9327-5f499ea67fbf 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: a1baa49c-f428-4e4d-801c-abc2136158a1] Received unexpected event network-vif-plugged-1b5b60bd-2531-4381-84cd-eb569ec9274c for instance with vm_state resized and task_state None.#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.716 2 DEBUG nova.compute.provider_tree [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.741 2 DEBUG nova.scheduler.client.report [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.766 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Successfully created port: 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.808 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:10 np0005470441 nova_compute[192626]: 2025-10-04 05:33:10.929 2 INFO nova.scheduler.client.report [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Deleted allocation for migration 99e4cd77-1340-4221-aff0-a61c9a0f4eaa#033[00m
Oct  4 01:33:11 np0005470441 nova_compute[192626]: 2025-10-04 05:33:11.012 2 DEBUG oslo_concurrency.lockutils [None req-1331a92f-7cf3-4015-a888-7c339da15fa2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "a1baa49c-f428-4e4d-801c-abc2136158a1" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:11 np0005470441 nova_compute[192626]: 2025-10-04 05:33:11.855 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Successfully updated port: 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:33:11 np0005470441 nova_compute[192626]: 2025-10-04 05:33:11.868 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:33:11 np0005470441 nova_compute[192626]: 2025-10-04 05:33:11.868 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquired lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:33:11 np0005470441 nova_compute[192626]: 2025-10-04 05:33:11.869 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:33:12 np0005470441 nova_compute[192626]: 2025-10-04 05:33:12.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:12 np0005470441 podman[221728]: 2025-10-04 05:33:12.29278092 +0000 UTC m=+0.047658413 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:33:12 np0005470441 podman[221729]: 2025-10-04 05:33:12.3094031 +0000 UTC m=+0.060415700 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:33:12 np0005470441 nova_compute[192626]: 2025-10-04 05:33:12.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:13 np0005470441 nova_compute[192626]: 2025-10-04 05:33:13.137 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:33:13 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:13.623 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:14 np0005470441 nova_compute[192626]: 2025-10-04 05:33:14.405 2 DEBUG nova.compute.manager [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:14 np0005470441 nova_compute[192626]: 2025-10-04 05:33:14.406 2 DEBUG nova.compute.manager [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing instance network info cache due to event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:33:14 np0005470441 nova_compute[192626]: 2025-10-04 05:33:14.406 2 DEBUG oslo_concurrency.lockutils [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.135 2 DEBUG nova.network.neutron [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updating instance_info_cache with network_info: [{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.226 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Releasing lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.226 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Instance network_info: |[{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.227 2 DEBUG oslo_concurrency.lockutils [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.227 2 DEBUG nova.network.neutron [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.230 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Start _get_guest_xml network_info=[{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.235 2 WARNING nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.241 2 DEBUG nova.virt.libvirt.host [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.241 2 DEBUG nova.virt.libvirt.host [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.246 2 DEBUG nova.virt.libvirt.host [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.246 2 DEBUG nova.virt.libvirt.host [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.248 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.248 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.248 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.248 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.249 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.249 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.249 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.249 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.250 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.250 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.250 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.250 2 DEBUG nova.virt.hardware [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.253 2 DEBUG nova.virt.libvirt.vif [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:33:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1687927062-ac',id=11,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKBOJjhX3sVJr/AcqYf1Kyiiq7grbD5SPm0FNcFfrb9TEEPE7/lsG5r9+3t5XixXI+9OaQd3W8glvARe/vIhhqJq/Jo2/EstUXHfW0Fz0J+PxhKephg3bVCE7I7vxMRAmQ==',key_name='tempest-TestSecurityGroupsBasicOps-2030986510',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b6962138a4941b79cb3bac6166c1b8a',ramdisk_id='',reservation_id='r-3xhrh88o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1687927062',owner_user_name='tempest-TestSecurityGroupsBasicOps-1687927062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:33:08Z,user_data=None,user_id='9d287f06015a40ac9c85190888828f26',uuid=e2fc47c6-9030-42b5-9a97-5c3c992f04a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.254 2 DEBUG nova.network.os_vif_util [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converting VIF {"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.254 2 DEBUG nova.network.os_vif_util [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.255 2 DEBUG nova.objects.instance [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lazy-loading 'pci_devices' on Instance uuid e2fc47c6-9030-42b5-9a97-5c3c992f04a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.367 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <uuid>e2fc47c6-9030-42b5-9a97-5c3c992f04a9</uuid>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <name>instance-0000000b</name>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263</nova:name>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:33:15</nova:creationTime>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:user uuid="9d287f06015a40ac9c85190888828f26">tempest-TestSecurityGroupsBasicOps-1687927062-project-member</nova:user>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:project uuid="5b6962138a4941b79cb3bac6166c1b8a">tempest-TestSecurityGroupsBasicOps-1687927062</nova:project>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        <nova:port uuid="8300d0dd-8dcc-4c1b-85a8-fe35e71cea24">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="serial">e2fc47c6-9030-42b5-9a97-5c3c992f04a9</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="uuid">e2fc47c6-9030-42b5-9a97-5c3c992f04a9</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.config"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:c1:b8:1d"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <target dev="tap8300d0dd-8d"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/console.log" append="off"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:33:15 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:33:15 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:33:15 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:33:15 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.369 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Preparing to wait for external event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.369 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.370 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.370 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.371 2 DEBUG nova.virt.libvirt.vif [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:33:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1687927062-ac',id=11,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKBOJjhX3sVJr/AcqYf1Kyiiq7grbD5SPm0FNcFfrb9TEEPE7/lsG5r9+3t5XixXI+9OaQd3W8glvARe/vIhhqJq/Jo2/EstUXHfW0Fz0J+PxhKephg3bVCE7I7vxMRAmQ==',key_name='tempest-TestSecurityGroupsBasicOps-2030986510',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b6962138a4941b79cb3bac6166c1b8a',ramdisk_id='',reservation_id='r-3xhrh88o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1687927062',owner_user_name='tempest-TestSecurityGroupsBasicOps-1687927062-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:33:08Z,user_data=None,user_id='9d287f06015a40ac9c85190888828f26',uuid=e2fc47c6-9030-42b5-9a97-5c3c992f04a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.371 2 DEBUG nova.network.os_vif_util [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converting VIF {"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.372 2 DEBUG nova.network.os_vif_util [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.372 2 DEBUG os_vif [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.373 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.377 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8300d0dd-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.378 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8300d0dd-8d, col_values=(('external_ids', {'iface-id': '8300d0dd-8dcc-4c1b-85a8-fe35e71cea24', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:b8:1d', 'vm-uuid': 'e2fc47c6-9030-42b5-9a97-5c3c992f04a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:15 np0005470441 NetworkManager[51690]: <info>  [1759555995.3807] manager: (tap8300d0dd-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.385 2 INFO os_vif [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d')
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.594 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.595 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.595 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] No VIF found with MAC fa:16:3e:c1:b8:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  4 01:33:15 np0005470441 nova_compute[192626]: 2025-10-04 05:33:15.595 2 INFO nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Using config drive
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.495 2 INFO nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Creating config drive at /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.config
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.499 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpptrzil71 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.623 2 DEBUG oslo_concurrency.processutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpptrzil71" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:33:17 np0005470441 NetworkManager[51690]: <info>  [1759555997.6827] manager: (tap8300d0dd-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Oct  4 01:33:17 np0005470441 kernel: tap8300d0dd-8d: entered promiscuous mode
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:17Z|00074|binding|INFO|Claiming lport 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 for this chassis.
Oct  4 01:33:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:17Z|00075|binding|INFO|8300d0dd-8dcc-4c1b-85a8-fe35e71cea24: Claiming fa:16:3e:c1:b8:1d 10.100.0.13
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.707 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:b8:1d 10.100.0.13'], port_security=['fa:16:3e:c1:b8:1d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e2fc47c6-9030-42b5-9a97-5c3c992f04a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b6962138a4941b79cb3bac6166c1b8a', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c6316be2-c97a-42d0-8b3f-4c8643bc20a2 d2a07532-451f-44ba-b9e3-5a3595e7433a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9788ad4d-f756-4f42-a6fd-eab97c574bc9, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.709 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 in datapath 7c191ef1-10b4-48e9-a8f5-106ae79eac48 bound to our chassis
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.711 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7c191ef1-10b4-48e9-a8f5-106ae79eac48
Oct  4 01:33:17 np0005470441 systemd-machined[152624]: New machine qemu-5-instance-0000000b.
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.724 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4bfe728e-0a03-4876-94ac-9214070d465c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.725 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7c191ef1-11 in ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.727 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7c191ef1-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.727 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d62d43-8c87-40dd-bbc1-ce3d7e1de038]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.728 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1657747e-4f6d-4135-9b00-38748b903fd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.738 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[724c325a-3118-4713-8918-f41c5ea68d05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:17Z|00076|binding|INFO|Setting lport 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 ovn-installed in OVS
Oct  4 01:33:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:17Z|00077|binding|INFO|Setting lport 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 up in Southbound
Oct  4 01:33:17 np0005470441 nova_compute[192626]: 2025-10-04 05:33:17.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.794 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2a8e3988-386f-46c9-93c9-e06aaf32b0cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 systemd[1]: Started Virtual Machine qemu-5-instance-0000000b.
Oct  4 01:33:17 np0005470441 systemd-udevd[221790]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.826 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[98630d66-1d35-49cb-8923-01476a2a15fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 systemd-udevd[221796]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.833 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9846d52d-c6c3-49ef-b5db-259e7b779564]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 NetworkManager[51690]: <info>  [1759555997.8364] manager: (tap7c191ef1-10): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Oct  4 01:33:17 np0005470441 NetworkManager[51690]: <info>  [1759555997.8375] device (tap8300d0dd-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:33:17 np0005470441 NetworkManager[51690]: <info>  [1759555997.8389] device (tap8300d0dd-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.866 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a377cd86-7f6a-45ed-a6f1-a05d5ce31adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.868 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[d28c994f-b9ab-4f32-8e49-cd54359150de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 NetworkManager[51690]: <info>  [1759555997.8905] device (tap7c191ef1-10): carrier: link connected
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.896 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7346bc99-a07a-4dea-98c5-84ca51fae735]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.911 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4470d8f4-4c30-44af-a33f-d3388bbe3048]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7c191ef1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:81:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389554, 'reachable_time': 27819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221817, 'error': None, 'target': 'ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.925 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[467882fb-c216-4fcd-8ec6-fa036c4a0a3b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:818e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389554, 'tstamp': 389554}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221818, 'error': None, 'target': 'ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.942 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[dd69772d-0663-404e-9039-03e5cfca92c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7c191ef1-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:81:8e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389554, 'reachable_time': 27819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221819, 'error': None, 'target': 'ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:17.972 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cd749ec4-227e-4791-b36a-eac874b29a66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.031 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b45b705e-de37-483b-8a86-af278c8e8823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.033 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c191ef1-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.034 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.034 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c191ef1-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:18 np0005470441 NetworkManager[51690]: <info>  [1759555998.0376] manager: (tap7c191ef1-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct  4 01:33:18 np0005470441 kernel: tap7c191ef1-10: entered promiscuous mode
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.040 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7c191ef1-10, col_values=(('external_ids', {'iface-id': '0fd1ac37-345c-439d-a828-ea94841fd08a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:18Z|00078|binding|INFO|Releasing lport 0fd1ac37-345c-439d-a828-ea94841fd08a from this chassis (sb_readonly=0)
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.052 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7c191ef1-10b4-48e9-a8f5-106ae79eac48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7c191ef1-10b4-48e9-a8f5-106ae79eac48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.053 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcea7f1-7431-4b4f-83c7-7e27adad01eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.054 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-7c191ef1-10b4-48e9-a8f5-106ae79eac48
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/7c191ef1-10b4-48e9-a8f5-106ae79eac48.pid.haproxy
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 7c191ef1-10b4-48e9-a8f5-106ae79eac48
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:33:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:18.055 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'env', 'PROCESS_TAG=haproxy-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7c191ef1-10b4-48e9-a8f5-106ae79eac48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:33:18 np0005470441 podman[221858]: 2025-10-04 05:33:18.416745826 +0000 UTC m=+0.025762348 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:33:18 np0005470441 podman[221858]: 2025-10-04 05:33:18.619295577 +0000 UTC m=+0.228312079 container create 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  4 01:33:18 np0005470441 systemd[1]: Started libpod-conmon-0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b.scope.
Oct  4 01:33:18 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:33:18 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dedd0906d2143c083623ec6881e6b2f4be3466370286a257750c00902f141ea8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:33:18 np0005470441 podman[221858]: 2025-10-04 05:33:18.711690108 +0000 UTC m=+0.320706610 container init 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  4 01:33:18 np0005470441 podman[221858]: 2025-10-04 05:33:18.717797498 +0000 UTC m=+0.326814000 container start 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.729 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555998.7283728, e2fc47c6-9030-42b5-9a97-5c3c992f04a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.729 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] VM Started (Lifecycle Event)#033[00m
Oct  4 01:33:18 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [NOTICE]   (221877) : New worker (221879) forked
Oct  4 01:33:18 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [NOTICE]   (221877) : Loading success.
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.790 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.796 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759555998.7294292, e2fc47c6-9030-42b5-9a97-5c3c992f04a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.796 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.825 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.829 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:33:18 np0005470441 nova_compute[192626]: 2025-10-04 05:33:18.856 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:33:20 np0005470441 nova_compute[192626]: 2025-10-04 05:33:20.041 2 DEBUG nova.network.neutron [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updated VIF entry in instance network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:33:20 np0005470441 nova_compute[192626]: 2025-10-04 05:33:20.041 2 DEBUG nova.network.neutron [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updating instance_info_cache with network_info: [{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:20 np0005470441 nova_compute[192626]: 2025-10-04 05:33:20.070 2 DEBUG oslo_concurrency.lockutils [req-858b8692-ff0c-44c7-85e0-0ef946262568 req-31294397-81e8-4359-b3cb-70e8d177324b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:33:20 np0005470441 podman[221888]: 2025-10-04 05:33:20.371486357 +0000 UTC m=+0.118678386 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm)
Oct  4 01:33:20 np0005470441 nova_compute[192626]: 2025-10-04 05:33:20.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.893 2 DEBUG nova.compute.manager [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG nova.compute.manager [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Processing event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG nova.compute.manager [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.894 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.895 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.895 2 DEBUG oslo_concurrency.lockutils [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.895 2 DEBUG nova.compute.manager [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] No waiting events found dispatching network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.895 2 WARNING nova.compute.manager [req-ec089960-3003-45b4-a89b-1a62b80ffaf4 req-63d7622a-2583-48a8-8e0f-6a05611e2a2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received unexpected event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 for instance with vm_state building and task_state spawning.
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.896 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.899 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556001.8992481, e2fc47c6-9030-42b5-9a97-5c3c992f04a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.899 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] VM Resumed (Lifecycle Event)
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.901 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.904 2 INFO nova.virt.libvirt.driver [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Instance spawned successfully.
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.904 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.972 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.978 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.981 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.981 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.982 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.982 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.983 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:21 np0005470441 nova_compute[192626]: 2025-10-04 05:33:21.983 2 DEBUG nova.virt.libvirt.driver [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.029 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.078 2 INFO nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Took 13.39 seconds to spawn the instance on the hypervisor.
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.079 2 DEBUG nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.171 2 INFO nova.compute.manager [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Took 14.14 seconds to build instance.
Oct  4 01:33:22 np0005470441 nova_compute[192626]: 2025-10-04 05:33:22.193 2 DEBUG oslo_concurrency.lockutils [None req-04291190-3f30-470d-9f87-3df23185df45 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.234s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:33:23 np0005470441 podman[221909]: 2025-10-04 05:33:23.292037991 +0000 UTC m=+0.048558020 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:33:25 np0005470441 nova_compute[192626]: 2025-10-04 05:33:25.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:26 np0005470441 podman[221933]: 2025-10-04 05:33:26.316828006 +0000 UTC m=+0.067276489 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:33:27 np0005470441 nova_compute[192626]: 2025-10-04 05:33:27.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:28 np0005470441 podman[221950]: 2025-10-04 05:33:28.332107063 +0000 UTC m=+0.084565513 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:33:30 np0005470441 nova_compute[192626]: 2025-10-04 05:33:30.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:30 np0005470441 nova_compute[192626]: 2025-10-04 05:33:30.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:30 np0005470441 NetworkManager[51690]: <info>  [1759556010.7886] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  4 01:33:30 np0005470441 NetworkManager[51690]: <info>  [1759556010.7899] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct  4 01:33:31 np0005470441 nova_compute[192626]: 2025-10-04 05:33:31.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:31 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:31Z|00079|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:33:31 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:31Z|00080|binding|INFO|Releasing lport 0fd1ac37-345c-439d-a828-ea94841fd08a from this chassis (sb_readonly=0)
Oct  4 01:33:31 np0005470441 nova_compute[192626]: 2025-10-04 05:33:31.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:32 np0005470441 nova_compute[192626]: 2025-10-04 05:33:32.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.063 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Creating tmpfile /var/lib/nova/instances/tmpqktyeed0 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.064 2 DEBUG nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqktyeed0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.276 2 DEBUG nova.compute.manager [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.276 2 DEBUG nova.compute.manager [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing instance network info cache due to event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.277 2 DEBUG oslo_concurrency.lockutils [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.278 2 DEBUG oslo_concurrency.lockutils [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:33:33 np0005470441 nova_compute[192626]: 2025-10-04 05:33:33.278 2 DEBUG nova.network.neutron [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  4 01:33:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:34Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:b8:1d 10.100.0.13
Oct  4 01:33:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:34Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:b8:1d 10.100.0.13
Oct  4 01:33:35 np0005470441 nova_compute[192626]: 2025-10-04 05:33:35.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:35 np0005470441 nova_compute[192626]: 2025-10-04 05:33:35.533 2 DEBUG nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqktyeed0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='286feccf-0ffd-498c-8db5-7128a3d0f965',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Oct  4 01:33:35 np0005470441 nova_compute[192626]: 2025-10-04 05:33:35.584 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:33:35 np0005470441 nova_compute[192626]: 2025-10-04 05:33:35.585 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquired lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:33:35 np0005470441 nova_compute[192626]: 2025-10-04 05:33:35.585 2 DEBUG nova.network.neutron [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  4 01:33:37 np0005470441 nova_compute[192626]: 2025-10-04 05:33:37.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:33:37 np0005470441 nova_compute[192626]: 2025-10-04 05:33:37.477 2 DEBUG nova.network.neutron [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updated VIF entry in instance network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  4 01:33:37 np0005470441 nova_compute[192626]: 2025-10-04 05:33:37.478 2 DEBUG nova.network.neutron [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updating instance_info_cache with network_info: [{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:33:37 np0005470441 nova_compute[192626]: 2025-10-04 05:33:37.511 2 DEBUG oslo_concurrency.lockutils [req-6d94e59a-eec4-44c2-8b8f-c4d7818d6a6a req-ccc793e3-2191-4824-b162-122cb9b7f57f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:33:38 np0005470441 podman[222009]: 2025-10-04 05:33:38.297392447 +0000 UTC m=+0.043935592 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:33:38 np0005470441 podman[222008]: 2025-10-04 05:33:38.323934332 +0000 UTC m=+0.078133265 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.554 2 DEBUG nova.network.neutron [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Updating instance_info_cache with network_info: [{"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.573 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Releasing lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.574 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqktyeed0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='286feccf-0ffd-498c-8db5-7128a3d0f965',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.575 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Creating instance directory: /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.575 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Creating disk.info with the contents: {'/var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk': 'qcow2', '/var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.576 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.576 2 DEBUG nova.objects.instance [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lazy-loading 'trusted_certs' on Instance uuid 286feccf-0ffd-498c-8db5-7128a3d0f965 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.610 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.669 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.670 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.671 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.685 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.742 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.743 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.892 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk 1073741824" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.894 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.223s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.894 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.953 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.954 2 DEBUG nova.virt.disk.api [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Checking if we can resize image /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:33:38 np0005470441 nova_compute[192626]: 2025-10-04 05:33:38.955 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.013 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.014 2 DEBUG nova.virt.disk.api [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Cannot resize image /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.014 2 DEBUG nova.objects.instance [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lazy-loading 'migration_context' on Instance uuid 286feccf-0ffd-498c-8db5-7128a3d0f965 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.031 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.063 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config 485376" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.064 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Copying file compute-2.ctlplane.example.com:/var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config to /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.065 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.531 2 DEBUG oslo_concurrency.processutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "scp -C -r compute-2.ctlplane.example.com:/var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk.config /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.532 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.533 2 DEBUG nova.virt.libvirt.vif [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:33:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1864739972',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1864739972',id=10,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:33:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=3,progress=0,project_id='0a30d290b7ef45f3ade527507f03ce55',ramdisk_id='',reservation_id='r-rxx7vjpi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-869616',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-869616-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:33:27Z,user_data=None,user_id='174330e695c64fc1ac9d921e330c5642',uuid=286feccf-0ffd-498c-8db5-7128a3d0f965,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.534 2 DEBUG nova.network.os_vif_util [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Converting VIF {"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.534 2 DEBUG nova.network.os_vif_util [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.535 2 DEBUG os_vif [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.536 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape82ee3ec-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.539 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape82ee3ec-eb, col_values=(('external_ids', {'iface-id': 'e82ee3ec-eb7b-4866-97bb-a0e71ab7a510', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:8d:c7', 'vm-uuid': '286feccf-0ffd-498c-8db5-7128a3d0f965'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:39 np0005470441 NetworkManager[51690]: <info>  [1759556019.5423] manager: (tape82ee3ec-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.551 2 INFO os_vif [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb')#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.551 2 DEBUG nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.552 2 DEBUG nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqktyeed0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='286feccf-0ffd-498c-8db5-7128a3d0f965',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  4 01:33:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:39Z|00081|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:33:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:39Z|00082|binding|INFO|Releasing lport 0fd1ac37-345c-439d-a828-ea94841fd08a from this chassis (sb_readonly=0)
Oct  4 01:33:39 np0005470441 nova_compute[192626]: 2025-10-04 05:33:39.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:41 np0005470441 nova_compute[192626]: 2025-10-04 05:33:41.717 2 DEBUG nova.network.neutron [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Port e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  4 01:33:41 np0005470441 nova_compute[192626]: 2025-10-04 05:33:41.719 2 DEBUG nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpqktyeed0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='286feccf-0ffd-498c-8db5-7128a3d0f965',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  4 01:33:41 np0005470441 kernel: tape82ee3ec-eb: entered promiscuous mode
Oct  4 01:33:41 np0005470441 NetworkManager[51690]: <info>  [1759556021.9643] manager: (tape82ee3ec-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct  4 01:33:41 np0005470441 nova_compute[192626]: 2025-10-04 05:33:41.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:41Z|00083|binding|INFO|Claiming lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 for this additional chassis.
Oct  4 01:33:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:41Z|00084|binding|INFO|e82ee3ec-eb7b-4866-97bb-a0e71ab7a510: Claiming fa:16:3e:4e:8d:c7 10.100.0.3
Oct  4 01:33:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:41Z|00085|binding|INFO|Setting lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 ovn-installed in OVS
Oct  4 01:33:41 np0005470441 nova_compute[192626]: 2025-10-04 05:33:41.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:41 np0005470441 nova_compute[192626]: 2025-10-04 05:33:41.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:41 np0005470441 systemd-udevd[222088]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:33:42 np0005470441 systemd-machined[152624]: New machine qemu-6-instance-0000000a.
Oct  4 01:33:42 np0005470441 NetworkManager[51690]: <info>  [1759556022.0106] device (tape82ee3ec-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:33:42 np0005470441 NetworkManager[51690]: <info>  [1759556022.0119] device (tape82ee3ec-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:33:42 np0005470441 systemd[1]: Started Virtual Machine qemu-6-instance-0000000a.
Oct  4 01:33:42 np0005470441 nova_compute[192626]: 2025-10-04 05:33:42.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:42 np0005470441 nova_compute[192626]: 2025-10-04 05:33:42.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:42 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:42Z|00086|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:33:42 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:42Z|00087|binding|INFO|Releasing lport 0fd1ac37-345c-439d-a828-ea94841fd08a from this chassis (sb_readonly=0)
Oct  4 01:33:42 np0005470441 nova_compute[192626]: 2025-10-04 05:33:42.944 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:43 np0005470441 podman[222106]: 2025-10-04 05:33:43.328541415 +0000 UTC m=+0.060176034 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:33:43 np0005470441 podman[222105]: 2025-10-04 05:33:43.346324331 +0000 UTC m=+0.087539163 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  4 01:33:43 np0005470441 nova_compute[192626]: 2025-10-04 05:33:43.713 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556023.7135653, 286feccf-0ffd-498c-8db5-7128a3d0f965 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:33:43 np0005470441 nova_compute[192626]: 2025-10-04 05:33:43.714 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] VM Started (Lifecycle Event)#033[00m
Oct  4 01:33:43 np0005470441 nova_compute[192626]: 2025-10-04 05:33:43.734 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.108 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556024.108231, 286feccf-0ffd-498c-8db5-7128a3d0f965 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.108 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.131 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.134 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: paused, current task_state: migrating, current DB power_state: 3, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.160 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] During the sync_power process the instance has moved from host compute-2.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  4 01:33:44 np0005470441 nova_compute[192626]: 2025-10-04 05:33:44.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:46 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:46Z|00088|binding|INFO|Claiming lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 for this chassis.
Oct  4 01:33:46 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:46Z|00089|binding|INFO|e82ee3ec-eb7b-4866-97bb-a0e71ab7a510: Claiming fa:16:3e:4e:8d:c7 10.100.0.3
Oct  4 01:33:46 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:46Z|00090|binding|INFO|Setting lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 up in Southbound
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.256 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:8d:c7 10.100.0.3'], port_security=['fa:16:3e:4e:8d:c7 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '11', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.258 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 bound to our chassis#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.260 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a984030f-c569-4bd0-83e0-9a6812d06f48#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.274 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9db3965c-9b68-46b8-b228-8246e4cc21ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.303 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8af59a30-3ab7-4678-8bda-ad25ee770791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.307 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a2d103ef-f5f9-4249-b41d-81ccbc2878b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.338 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[edc9415d-6172-400e-9942-da7d0e087cac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.356 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf43704-c82b-4356-bbe8-284760cc8ee4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 1462, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 1462, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222152, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.372 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[403f330e-6ceb-4305-a530-bb63ace90836]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386682, 'tstamp': 386682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222153, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386685, 'tstamp': 386685}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222153, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.374 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:46 np0005470441 nova_compute[192626]: 2025-10-04 05:33:46.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:46 np0005470441 nova_compute[192626]: 2025-10-04 05:33:46.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.377 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa984030f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.378 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.378 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa984030f-c0, col_values=(('external_ids', {'iface-id': '3ea6f406-5ff7-4b46-9301-f23ee9be4b86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:46 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:46.379 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:33:46 np0005470441 nova_compute[192626]: 2025-10-04 05:33:46.502 2 INFO nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Post operation of migration started#033[00m
Oct  4 01:33:47 np0005470441 nova_compute[192626]: 2025-10-04 05:33:47.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:47 np0005470441 nova_compute[192626]: 2025-10-04 05:33:47.399 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:33:47 np0005470441 nova_compute[192626]: 2025-10-04 05:33:47.400 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquired lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:33:47 np0005470441 nova_compute[192626]: 2025-10-04 05:33:47.400 2 DEBUG nova.network.neutron [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.679 2 DEBUG nova.network.neutron [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Updating instance_info_cache with network_info: [{"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.710 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Releasing lock "refresh_cache-286feccf-0ffd-498c-8db5-7128a3d0f965" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.731 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.731 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.732 2 DEBUG oslo_concurrency.lockutils [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:48 np0005470441 nova_compute[192626]: 2025-10-04 05:33:48.736 2 INFO nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  4 01:33:48 np0005470441 virtqemud[192168]: Domain id=6 name='instance-0000000a' uuid=286feccf-0ffd-498c-8db5-7128a3d0f965 is tainted: custom-monitor
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.122 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.123 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.123 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.123 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.123 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.124 2 INFO nova.compute.manager [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Terminating instance#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.125 2 DEBUG nova.compute.manager [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:33:49 np0005470441 kernel: tap8300d0dd-8d (unregistering): left promiscuous mode
Oct  4 01:33:49 np0005470441 NetworkManager[51690]: <info>  [1759556029.1554] device (tap8300d0dd-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:49Z|00091|binding|INFO|Releasing lport 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 from this chassis (sb_readonly=0)
Oct  4 01:33:49 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:49Z|00092|binding|INFO|Setting lport 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 down in Southbound
Oct  4 01:33:49 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:49Z|00093|binding|INFO|Removing iface tap8300d0dd-8d ovn-installed in OVS
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.182 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:b8:1d 10.100.0.13'], port_security=['fa:16:3e:c1:b8:1d 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'e2fc47c6-9030-42b5-9a97-5c3c992f04a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b6962138a4941b79cb3bac6166c1b8a', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c6316be2-c97a-42d0-8b3f-4c8643bc20a2 d2a07532-451f-44ba-b9e3-5a3595e7433a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9788ad4d-f756-4f42-a6fd-eab97c574bc9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.183 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 in datapath 7c191ef1-10b4-48e9-a8f5-106ae79eac48 unbound from our chassis#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.185 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7c191ef1-10b4-48e9-a8f5-106ae79eac48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.186 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2529b586-b520-4d4d-8b19-8859e1a14b71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.186 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48 namespace which is not needed anymore#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.195 2 DEBUG nova.compute.manager [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.195 2 DEBUG nova.compute.manager [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing instance network info cache due to event network-changed-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.196 2 DEBUG oslo_concurrency.lockutils [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.196 2 DEBUG oslo_concurrency.lockutils [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.196 2 DEBUG nova.network.neutron [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Refreshing network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct  4 01:33:49 np0005470441 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000b.scope: Consumed 13.384s CPU time.
Oct  4 01:33:49 np0005470441 systemd-machined[152624]: Machine qemu-5-instance-0000000b terminated.
Oct  4 01:33:49 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [NOTICE]   (221877) : haproxy version is 2.8.14-c23fe91
Oct  4 01:33:49 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [NOTICE]   (221877) : path to executable is /usr/sbin/haproxy
Oct  4 01:33:49 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [WARNING]  (221877) : Exiting Master process...
Oct  4 01:33:49 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [ALERT]    (221877) : Current worker (221879) exited with code 143 (Terminated)
Oct  4 01:33:49 np0005470441 neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48[221873]: [WARNING]  (221877) : All workers exited. Exiting... (0)
Oct  4 01:33:49 np0005470441 systemd[1]: libpod-0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b.scope: Deactivated successfully.
Oct  4 01:33:49 np0005470441 podman[222177]: 2025-10-04 05:33:49.320933118 +0000 UTC m=+0.041576975 container died 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:33:49 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b-userdata-shm.mount: Deactivated successfully.
Oct  4 01:33:49 np0005470441 systemd[1]: var-lib-containers-storage-overlay-dedd0906d2143c083623ec6881e6b2f4be3466370286a257750c00902f141ea8-merged.mount: Deactivated successfully.
Oct  4 01:33:49 np0005470441 podman[222177]: 2025-10-04 05:33:49.35580602 +0000 UTC m=+0.076449877 container cleanup 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:33:49 np0005470441 systemd[1]: libpod-conmon-0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b.scope: Deactivated successfully.
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.377 2 INFO nova.virt.libvirt.driver [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Instance destroyed successfully.#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.378 2 DEBUG nova.objects.instance [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lazy-loading 'resources' on Instance uuid e2fc47c6-9030-42b5-9a97-5c3c992f04a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.395 2 DEBUG nova.virt.libvirt.vif [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:33:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1687927062-access_point-1264103263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1687927062-ac',id=11,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKBOJjhX3sVJr/AcqYf1Kyiiq7grbD5SPm0FNcFfrb9TEEPE7/lsG5r9+3t5XixXI+9OaQd3W8glvARe/vIhhqJq/Jo2/EstUXHfW0Fz0J+PxhKephg3bVCE7I7vxMRAmQ==',key_name='tempest-TestSecurityGroupsBasicOps-2030986510',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:33:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b6962138a4941b79cb3bac6166c1b8a',ramdisk_id='',reservation_id='r-3xhrh88o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1687927062',owner_user_name='tempest-TestSecurityGroupsBasicOps-1687927062-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:33:22Z,user_data=None,user_id='9d287f06015a40ac9c85190888828f26',uuid=e2fc47c6-9030-42b5-9a97-5c3c992f04a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.395 2 DEBUG nova.network.os_vif_util [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converting VIF {"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.197", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.396 2 DEBUG nova.network.os_vif_util [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.396 2 DEBUG os_vif [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.398 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8300d0dd-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:49 np0005470441 podman[222220]: 2025-10-04 05:33:49.414926863 +0000 UTC m=+0.038136547 container remove 0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.441 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1c60bb97-b718-48aa-9b7d-af1463675ef2]: (4, ('Sat Oct  4 05:33:49 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48 (0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b)\n0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b\nSat Oct  4 05:33:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48 (0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b)\n0288946d31f72ff2a5dd68f46ce42ca3ae883c0175863c4bcd86a310dd0be43b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.443 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4d56418a-9204-4de8-8d50-5ea4c5787d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.444 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c191ef1-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.446 2 INFO os_vif [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c1:b8:1d,bridge_name='br-int',has_traffic_filtering=True,id=8300d0dd-8dcc-4c1b-85a8-fe35e71cea24,network=Network(7c191ef1-10b4-48e9-a8f5-106ae79eac48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8300d0dd-8d')#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.447 2 INFO nova.virt.libvirt.driver [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Deleting instance files /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9_del#033[00m
Oct  4 01:33:49 np0005470441 kernel: tap7c191ef1-10: left promiscuous mode
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.447 2 INFO nova.virt.libvirt.driver [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Deletion of /var/lib/nova/instances/e2fc47c6-9030-42b5-9a97-5c3c992f04a9_del complete#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.453 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4c13156c-53f5-4ca2-9a5f-61f6d92e4c62]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.486 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[27f31b3e-9f78-4c6b-8bc1-969b4a4d62b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.488 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4079df49-617c-43cc-9099-36b3aa7bde89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.500 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a208bfa6-1422-4604-92ad-281e8fca95a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389547, 'reachable_time': 35400, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222235, 'error': None, 'target': 'ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.502 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7c191ef1-10b4-48e9-a8f5-106ae79eac48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:33:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:33:49.503 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d9f7bd-ce8d-4537-b58a-d185f32a5f24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:33:49 np0005470441 systemd[1]: run-netns-ovnmeta\x2d7c191ef1\x2d10b4\x2d48e9\x2da8f5\x2d106ae79eac48.mount: Deactivated successfully.
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.509 2 INFO nova.compute.manager [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.511 2 DEBUG oslo.service.loopingcall [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.512 2 DEBUG nova.compute.manager [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.512 2 DEBUG nova.network.neutron [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:33:49 np0005470441 nova_compute[192626]: 2025-10-04 05:33:49.743 2 INFO nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.700 2 DEBUG nova.network.neutron [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.727 2 INFO nova.compute.manager [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Took 1.22 seconds to deallocate network for instance.#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.749 2 INFO nova.virt.libvirt.driver [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.753 2 DEBUG nova.compute.manager [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.780 2 DEBUG nova.objects.instance [None req-a4b5f3a9-ea72-47fb-b99e-83f570d911dd 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.817 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.818 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.904 2 DEBUG nova.compute.provider_tree [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.922 2 DEBUG nova.scheduler.client.report [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.962 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:50 np0005470441 nova_compute[192626]: 2025-10-04 05:33:50.983 2 INFO nova.scheduler.client.report [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Deleted allocations for instance e2fc47c6-9030-42b5-9a97-5c3c992f04a9#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.050 2 DEBUG oslo_concurrency.lockutils [None req-a2315bcb-903b-4037-9e0f-e542d7f529d0 9d287f06015a40ac9c85190888828f26 5b6962138a4941b79cb3bac6166c1b8a - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.142 2 DEBUG nova.network.neutron [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updated VIF entry in instance network info cache for port 8300d0dd-8dcc-4c1b-85a8-fe35e71cea24. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.143 2 DEBUG nova.network.neutron [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Updating instance_info_cache with network_info: [{"id": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "address": "fa:16:3e:c1:b8:1d", "network": {"id": "7c191ef1-10b4-48e9-a8f5-106ae79eac48", "bridge": "br-int", "label": "tempest-network-smoke--434287495", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b6962138a4941b79cb3bac6166c1b8a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8300d0dd-8d", "ovs_interfaceid": "8300d0dd-8dcc-4c1b-85a8-fe35e71cea24", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.162 2 DEBUG oslo_concurrency.lockutils [req-5f6ed421-16dd-47e5-9e95-7fb228141dcc req-f40b29f4-e3df-4e73-b884-0f424ac135d4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e2fc47c6-9030-42b5-9a97-5c3c992f04a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:33:51 np0005470441 podman[222237]: 2025-10-04 05:33:51.314752333 +0000 UTC m=+0.058814865 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.679 2 DEBUG nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-vif-unplugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.680 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.680 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.680 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.681 2 DEBUG nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] No waiting events found dispatching network-vif-unplugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.681 2 WARNING nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received unexpected event network-vif-unplugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.681 2 DEBUG nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.681 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.682 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.682 2 DEBUG oslo_concurrency.lockutils [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e2fc47c6-9030-42b5-9a97-5c3c992f04a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.682 2 DEBUG nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] No waiting events found dispatching network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.682 2 WARNING nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received unexpected event network-vif-plugged-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.682 2 DEBUG nova.compute.manager [req-147a4b03-83b0-443f-ad27-ce7aa0e21960 req-e4e99381-cf36-4663-a26a-9cbe899eee70 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Received event network-vif-deleted-8300d0dd-8dcc-4c1b-85a8-fe35e71cea24 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:33:51 np0005470441 nova_compute[192626]: 2025-10-04 05:33:51.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:52 np0005470441 nova_compute[192626]: 2025-10-04 05:33:52.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:52 np0005470441 nova_compute[192626]: 2025-10-04 05:33:52.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:54 np0005470441 podman[222257]: 2025-10-04 05:33:54.286225232 +0000 UTC m=+0.043442737 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:33:54 np0005470441 nova_compute[192626]: 2025-10-04 05:33:54.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:55Z|00094|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.744 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.830 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.889 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.890 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:33:55Z|00095|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.956 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:55 np0005470441 nova_compute[192626]: 2025-10-04 05:33:55.961 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.023 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.024 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.091 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.251 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.252 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5511MB free_disk=73.4380111694336GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.252 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.253 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.365 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance b89756b5-b481-4ad9-aaf8-afda62b5d1bc actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.365 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 286feccf-0ffd-498c-8db5-7128a3d0f965 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.366 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.366 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.455 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.470 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.492 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:33:56 np0005470441 nova_compute[192626]: 2025-10-04 05:33:56.493 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:33:57 np0005470441 podman[222295]: 2025-10-04 05:33:57.296283768 +0000 UTC m=+0.047930755 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.494 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.494 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:33:57 np0005470441 nova_compute[192626]: 2025-10-04 05:33:57.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:33:58 np0005470441 nova_compute[192626]: 2025-10-04 05:33:58.384 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:33:58 np0005470441 nova_compute[192626]: 2025-10-04 05:33:58.384 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:33:58 np0005470441 nova_compute[192626]: 2025-10-04 05:33:58.384 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:33:58 np0005470441 nova_compute[192626]: 2025-10-04 05:33:58.385 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b89756b5-b481-4ad9-aaf8-afda62b5d1bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:33:59 np0005470441 podman[222315]: 2025-10-04 05:33:59.334907788 +0000 UTC m=+0.087420789 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:33:59 np0005470441 nova_compute[192626]: 2025-10-04 05:33:59.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.917 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Updating instance_info_cache with network_info: [{"id": "b0eb2882-c375-490a-9308-11da20a838e8", "address": "fa:16:3e:b3:d8:fc", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0eb2882-c3", "ovs_interfaceid": "b0eb2882-c375-490a-9308-11da20a838e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.962 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-b89756b5-b481-4ad9-aaf8-afda62b5d1bc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.962 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.963 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.963 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:34:00 np0005470441 nova_compute[192626]: 2025-10-04 05:34:00.963 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:34:01 np0005470441 nova_compute[192626]: 2025-10-04 05:34:01.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:34:02 np0005470441 nova_compute[192626]: 2025-10-04 05:34:02.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.708 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0a30d290b7ef45f3ade527507f03ce55', 'user_id': '174330e695c64fc1ac9d921e330c5642', 'hostId': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.711 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '0a30d290b7ef45f3ade527507f03ce55', 'user_id': '174330e695c64fc1ac9d921e330c5642', 'hostId': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.711 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.712 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>]
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.712 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.730 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.latency volume: 106014025 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.731 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.749 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.750 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cde1455b-65fe-4a80-9b85-9522292fad77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 106014025, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.712422', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc660872-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '88d31c589d62f2d41d06168ca1a9cfb52a6861fc682bc2a70646535a5ffa9c2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.712422', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc661506-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '94b810b7e17856c489e4588785d18b37e878db56c9a424dcbdf82a19f9268ce9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.712422', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc68f62c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '2bc7029516f3a456ddf430c0b4f4939eb3b6f0a420c189c188598881a09a5e38'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.712422', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc690126-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': 'dd53de5c3f6734dff3b1b70762ecf218ff5d6e8aedb2ce6405b7b0071fee106d'}]}, 'timestamp': '2025-10-04 05:34:02.750588', '_unique_id': 'e68c8fa194174c64b9e1bb10286d9b56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.751 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.752 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.755 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b89756b5-b481-4ad9-aaf8-afda62b5d1bc / tapb0eb2882-c3 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.755 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.incoming.bytes volume: 1006 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.757 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 286feccf-0ffd-498c-8db5-7128a3d0f965 / tape82ee3ec-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.757 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0794cfce-6a33-41c1-8879-a92b56f55a17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1006, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.752564', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc69d1be-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '4b9bf8f606dc0458f198698567f31fba8132a28846f0efdf8d5dc196ee5df639'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.752564', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc6a1c1e-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': 'a9a51ac5af592aa78f5e4bcc51cfb943359009c9c290eee85eb42e4809eb0640'}]}, 'timestamp': '2025-10-04 05:34:02.757759', '_unique_id': '18079babbdb240408dd6559c59bca43a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.758 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.759 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.759 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.759 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.759 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25e5418a-5ec7-41da-a7cf-bbc8618bf077', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.759233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc6a5f44-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '452777519ae082cb14841648eb0c255e36bd4c46f28d2b0fc461d85faa15f6e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 
'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.759233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc6a69d0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '267980846f790148974e395cf969cd749f73ddc29c49b61cfef94a65f9ed4b20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.759233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc6a7240-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '3dc3b971c3a9aac3212cd44553cf4a170bf8342ae0717601f697055f0c537b6f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.759233', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc6a79ca-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '4d6f823ee756c3d60b6ffb08d8910c6c7fb529e4bef26848b7b75d1a519dfe2d'}]}, 'timestamp': '2025-10-04 05:34:02.760126', '_unique_id': 'acbab3e05de24124b3e7ee50c4fa9eea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.760 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.761 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.761 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.761 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd965f5d5-1684-48f7-8fc7-f8302add33e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.761283', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc6aaf58-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '6753ba0032f51d64ebdee198e7c44a61b0d9a3157351aeb3def370b74da995a6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.761283', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc6aba16-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': 'd79609f17054ba3a542b62989781c69e5e0aafa19aa7cec5d9f218ead497dc3a'}]}, 'timestamp': '2025-10-04 05:34:02.761787', '_unique_id': '06824fad9e374ef1b3d8d80705ffdf7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.762 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.outgoing.bytes volume: 5630 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e84c239-2599-4877-9b42-fbdebfe2c9aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5630, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.762909', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc6aeebe-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '39faf6fdbde16f6a9e3b16efc6e7683c8140b74d5acf461ae5497b707fb0d3c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.762909', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc6af6de-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '226f5f090b7b662b0dd51c2cefa110885d2505f77c87b41f4c0297fde7abbebf'}]}, 'timestamp': '2025-10-04 05:34:02.763336', '_unique_id': '5b64993fbd5d42eca3372be284458ea0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.763 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.764 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.764 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.764 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a751abf0-551e-4b8f-a567-79b46c7b5592', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.764420', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc6b2aaa-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '446127194fcbf7d29bc96a6cfc89c96c37a6ab7ed058b6d972f1e020d8fedffb'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.764420', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc6b32fc-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '358c763c18d112bf5356e449598d8882145364f78aa89a8d9a2ca539e73634dc'}]}, 'timestamp': '2025-10-04 05:34:02.764875', '_unique_id': '79f7d0b2f446418fae9c8c6f01f4f623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.765 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.766 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.766 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.766 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3e9625b1-b0e4-41b4-9515-ffca2f37b400', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.765951', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc6b6574-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '544c54c71c33888c0163b940613182db6b12e752106fb94091165c2e1f7aa4db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 
'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.765951', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc6b6d12-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '86b6c4f184f85fc7bc7ce73cbc7bc08c0de033759eee14f271146b1b99fcac11'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.765951', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc6b756e-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '9cdd7c8c20ada60e4e6e3b99405836ca2139fe2432703a1afc2382f8d4702ab7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.765951', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc6b7ea6-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '99566beee43ea8294bc24c1c8171517689309760b61c28d4939a124f929b217e'}]}, 'timestamp': '2025-10-04 05:34:02.766805', '_unique_id': '2f2e757973f845f8a4207f0f7c071f29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.767 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.782 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/memory.usage volume: 42.53125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.797 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.798 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 286feccf-0ffd-498c-8db5-7128a3d0f965: ceilometer.compute.pollsters.NoVolumeException
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e42080f6-b58a-4c8d-8b54-153858f68224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.53125, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'timestamp': '2025-10-04T05:34:02.767972', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'bc6df032-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.496786572, 'message_signature': '75259afc015f7944bfb5fd31a70857603f9cda6128733bb7ebaeacc3bbf727a2'}]}, 'timestamp': '2025-10-04 05:34:02.798249', '_unique_id': '0964b7fa32434d60940b3d5f52e5e8e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>]
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.800 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ad2fc87-d498-4ac8-a810-8e1c588a8d9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.800479', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc70aca0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '4dcef4d1278f9ddf40602351dd05e1fe23bebb7829b39028f3b28b05d3700097'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.800479', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc70b7e0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '8e6f8ae2a5999a6ac860d2167dec7d836becdee30947d5208cd07e7cd3a6d7d5'}]}, 'timestamp': '2025-10-04 05:34:02.801055', '_unique_id': '69d31b29acb64ce6bb546650498ad670'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.812 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.813 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.822 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.822 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e0f3cef-dee8-475e-84a7-08caf087a20f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.802311', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc72933a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': '350616b12422e9e850546157870eefc11e190e87223f05e39d3ff5e7aa42074d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 
'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.802311', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc729d1c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': '903cbdca70309a6f026e39f036e18776b33c554a6f72bbfc5e4a8038c16cdae6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.802311', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc73f4c8-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': 'f6c99b0956f5f604c28232a30e0c76e4eb0449f6d380b726c3953d63cda0a926'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.802311', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc73fe00-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': '7e9fa6497dac436d49809a912338ebb49f38362966b4cf652d3b4a3597d32a0b'}]}, 'timestamp': '2025-10-04 05:34:02.822523', '_unique_id': '62f145b0e13d4d338162bcdb4e359053'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.823 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.824 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.requests volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.824 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.824 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.824 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e013911e-0523-4d9e-af71-1d5e6be96f6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 24, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.824041', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc7442de-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': 'e47dba6e0e591ae46930925724a8062e3c6e208cfc55a6436858f49e8e7631fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 
'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.824041', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc744ab8-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '9a30e029e17bae903b56fab8ecc3a51f2f18a2b1a4bf0b23af8a1e7729611343'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.824041', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 
'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc745490-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': 'f99a2fa37f90f553ae48d5d5095a9f485277a99ad17d96ce9cd21adf84fa923a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.824041', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc745e2c-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '6369d780466e716b1bc8cb38272d7eca1f357084d02133a8b4485026b6e02fd9'}]}, 'timestamp': '2025-10-04 05:34:02.824996', '_unique_id': 'fd43e20323c8485aa5ea2b977aa73af6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.826 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.826 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.826 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8defc3f2-cef0-4ab3-89e9-ccd900bbbf77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.826355', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc749df6-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '2ec57c207d87835285f5fbfa348b4b297a2283a008bfd59c11c40e6febe66afe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 
'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.826355', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc74a6ac-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '8f4d14b8914087399a0de7ce5f0c6f52e7c9b47a0ccf229c2f111d36bcd93145'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.826355', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc74ae4a-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '115c5d626a766bdef1915e72ebd3322ca7c09a1e62cfa3439ec4162fd7d2d45c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.826355', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc74b5a2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': 'c511d4d06d8158dad83143db9e967ad80a9d82d07a97cbd8c333078d06aadd95'}]}, 'timestamp': '2025-10-04 05:34:02.827194', '_unique_id': 'c50efd4a4d0c41bca676a507602447d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.827 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.828 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.outgoing.packets volume: 81 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.828 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82abc0dc-e9a0-4199-8540-c5d2b3a30fd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 81, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.828314', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc74e9e6-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '79443db8aca4531d74e8a09360befe30dcecbc4c02a53429b48fe0afc31aafa4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.828314', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc74f3aa-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': 'c296f86387887c116cc2a23044ba65f6205a8b4c81d646b3ba522edcffaf4a1b'}]}, 'timestamp': '2025-10-04 05:34:02.828791', '_unique_id': 'a4db583006a44cc8996202d6bda01c94'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.829 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2928dc-be9d-4189-bfa8-b61f8ba69090', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.829904', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc7528de-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': 'a303e8db91c2240eabbf5bf7b327a8d08357f6a2d982d375cdcc1541467fe92f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.829904', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc753180-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '46c1c8a61cc5d00ebf865070b989b63bd30edf5f7710233c458f6249a296e1ad'}]}, 'timestamp': '2025-10-04 05:34:02.830402', '_unique_id': '21893b1378d64a5d8f947451acd085e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.831 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.831 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.bytes volume: 184320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.831 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.831 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.832 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e178a6b9-0a70-4fbf-a649-55e807228003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 184320, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.831488', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc756740-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': '483e4753f180b60749d0fded736aa6e771d0b28f7ae824006f37ae5a2a40422a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 
'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.831488', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc756f38-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.426950154, 'message_signature': 'fb130d11fcb5c930af6cab16ea1abf31b8bb12325e7a5879238e52e5af3326bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.831488', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc7576ae-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '5a351c5d2551a24b6fd1567ff49ff548e9af2a9bd5cc58df6223ed085c2b5c2f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.831488', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc758054-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.445939695, 'message_signature': '3c7b6bec3cb8f6d01dffc4f083112f16475229a9043a8b9daaca9c3f6af26220'}]}, 'timestamp': '2025-10-04 05:34:02.832416', '_unique_id': '95a7734046c34bc885002f66447a542b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.833 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.834 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>]
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.834 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.834 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.834 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06fe3f62-e4ba-4c30-ace0-f5f069563b3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.834337', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc75d928-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': '37102fbc1e2a1ac1e99827b29b2675076be1dac7c5812b23f853777bcf6c7164'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
1, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.834337', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc75e4d6-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '00e4098e1e23c927c649c2335bd8e8ee62dc263ac911a291642be6273f5156b3'}]}, 'timestamp': '2025-10-04 05:34:02.835012', '_unique_id': '9efa2cc4f3534ca8b89d35b10688b632'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.835 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-155332389>, <NovaLikeServer: tempest-LiveAutoBlockMigrationV225Test-server-1864739972>]
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.836 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6869ef0a-c83d-4ac2-99cf-7cf7f0105181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.836653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc763012-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': 'c2055d3b6c5af8150e178db530966edd779903126e960a915e4fbd346bddf372'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.836653', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc763882-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '9e2465a75dfd68a438706fad470665d646d738691578e04407a9279967d23350'}]}, 'timestamp': '2025-10-04 05:34:02.837105', '_unique_id': '1385b9b47d744e5b935855f7ac5f3fa6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.838 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.838 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.838 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43c05f35-3d78-4a02-b76d-4ec605e61397', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-00000003-b89756b5-b481-4ad9-aaf8-afda62b5d1bc-tapb0eb2882-c3', 'timestamp': '2025-10-04T05:34:02.838648', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'tapb0eb2882-c3', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b3:d8:fc', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb0eb2882-c3'}, 'message_id': 'bc767e82-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.467147419, 'message_signature': 'b806cf415967287f0d76d74bf5d2a2b26ac0cdeea0ffdc9598f5652d986e973e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'instance-0000000a-286feccf-0ffd-498c-8db5-7128a3d0f965-tape82ee3ec-eb', 'timestamp': '2025-10-04T05:34:02.838648', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'tape82ee3ec-eb', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:8d:c7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape82ee3ec-eb'}, 'message_id': 'bc768a30-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.470395651, 'message_signature': '9b1080033c952c4410d38b5fdd4720a410f5fb0213c70f014c9174e95ddcd36a'}]}, 'timestamp': '2025-10-04 05:34:02.839237', '_unique_id': 'e5cd4ed9016740feba8054e0ef10c89b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.840 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.840 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.840 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.841 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.841 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd20de834-9d7e-4960-ae6d-d45e60d03379', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.840655', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc76cce8-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': '6e424538304ee994c7611c4d9109b8c44a193b5c411140856c9ffe9f6e8e0fb6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 
'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.840655', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc76d918-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': '3d80b169dc2a61e59c045e1730fe413e81a97b82f406b9edd022e2c5cced75b2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.840655', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc76e386-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': '09158023be4f9617e2bc4a8f155534a5bff0c230474de43d17cd2359a44b4eee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.840655', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc76ebec-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': 'fd0f3c8c05d073f566e2e3d00e34ae4c160d01fe1e990f39736bc47fa3772a25'}]}, 'timestamp': '2025-10-04 05:34:02.841691', '_unique_id': '5064f1f1cd8c44c397512a4aa3f2ba06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.842 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.allocation volume: 30412800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.843 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.843 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.843 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '34869ba2-d52b-4676-9e2a-9d88f6fdf6b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30412800, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-vda', 'timestamp': '2025-10-04T05:34:02.842860', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc772184-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': 'cea073cfbce7d50728d193e0f725dab0cd10d361c5c28bd9967b028f759d1a7d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 
'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc-sda', 'timestamp': '2025-10-04T05:34:02.842860', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc7728f0-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.516874564, 'message_signature': '9a5707a791fe9df0d6aa6b341e665569e509b45cab717b60d22ff373ad3c5fbc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-vda', 'timestamp': '2025-10-04T05:34:02.842860', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'bc7730a2-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': '31dfb33fea0569c851f693998ce0220dc7ccdd44b35bf0d8a02cf21c958109cb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965-sda', 'timestamp': '2025-10-04T05:34:02.842860', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'bc773872-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.52798973, 'message_signature': '3c80a7d76ea13f55bd71891972c79989ae96d2ec9ef75288db10353a7cb3d6fa'}]}, 'timestamp': '2025-10-04 05:34:02.843647', '_unique_id': 'c59cabd8b6fe46dfa8eb38e4de9cd241'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 DEBUG ceilometer.compute.pollsters [-] b89756b5-b481-4ad9-aaf8-afda62b5d1bc/cpu volume: 330000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.844 12 DEBUG ceilometer.compute.pollsters [-] 286feccf-0ffd-498c-8db5-7128a3d0f965/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51a5e2d9-910d-40a0-9131-b64e70ed1adc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 330000000, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'timestamp': '2025-10-04T05:34:02.844738', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-155332389', 'name': 'instance-00000003', 'instance_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bc776b80-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.496786572, 'message_signature': 'df072f9587970595f745512c408a432e943d83183e358bb5c7aef41a07c3fc86'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '174330e695c64fc1ac9d921e330c5642', 'user_name': None, 'project_id': '0a30d290b7ef45f3ade527507f03ce55', 'project_name': None, 'resource_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'timestamp': '2025-10-04T05:34:02.844738', 'resource_metadata': {'display_name': 'tempest-LiveAutoBlockMigrationV225Test-server-1864739972', 'name': 'instance-0000000a', 'instance_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'instance_type': 'm1.nano', 'host': '2def2388d6dc7dd5f840aec73ddcd666104b114e82b720cb0b6a750e', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'bc777314-a0e3-11f0-8814-fa163ed2379c', 'monotonic_time': 3940.51216514, 'message_signature': '09fd7f92958749de2ff394072ea6ef289463d56b60f5ffea02585238f134b3f7'}]}, 'timestamp': '2025-10-04 05:34:02.845149', '_unique_id': '1036fcbddff34521a3d2c7ac06137f20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:34:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:34:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:34:04 np0005470441 nova_compute[192626]: 2025-10-04 05:34:04.374 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556029.3737688, e2fc47c6-9030-42b5-9a97-5c3c992f04a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:34:04 np0005470441 nova_compute[192626]: 2025-10-04 05:34:04.374 2 INFO nova.compute.manager [-] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] VM Stopped (Lifecycle Event)
Oct  4 01:34:04 np0005470441 nova_compute[192626]: 2025-10-04 05:34:04.414 2 DEBUG nova.compute.manager [None req-629365ef-df02-4391-9475-3c0bdb682679 - - - - - -] [instance: e2fc47c6-9030-42b5-9a97-5c3c992f04a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:34:04 np0005470441 nova_compute[192626]: 2025-10-04 05:34:04.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:06.214 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:34:06 np0005470441 nova_compute[192626]: 2025-10-04 05:34:06.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:06.215 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  4 01:34:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:06.738 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:34:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:06.738 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:34:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:06.739 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:34:07 np0005470441 nova_compute[192626]: 2025-10-04 05:34:07.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:09 np0005470441 podman[222342]: 2025-10-04 05:34:09.293305271 +0000 UTC m=+0.047862783 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:34:09 np0005470441 podman[222341]: 2025-10-04 05:34:09.293379453 +0000 UTC m=+0.051763134 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Oct  4 01:34:09 np0005470441 nova_compute[192626]: 2025-10-04 05:34:09.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:12 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:12.217 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:34:12 np0005470441 nova_compute[192626]: 2025-10-04 05:34:12.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:34:14 np0005470441 podman[222385]: 2025-10-04 05:34:14.29946255 +0000 UTC m=+0.053437512 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:34:14 np0005470441 podman[222386]: 2025-10-04 05:34:14.311612636 +0000 UTC m=+0.060770691 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:34:14 np0005470441 nova_compute[192626]: 2025-10-04 05:34:14.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:17 np0005470441 nova_compute[192626]: 2025-10-04 05:34:17.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:19 np0005470441 nova_compute[192626]: 2025-10-04 05:34:19.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:21 np0005470441 nova_compute[192626]: 2025-10-04 05:34:21.053 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Creating tmpfile /var/lib/nova/instances/tmpkl5_lxfs to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Oct  4 01:34:21 np0005470441 nova_compute[192626]: 2025-10-04 05:34:21.054 2 DEBUG nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkl5_lxfs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Oct  4 01:34:22 np0005470441 nova_compute[192626]: 2025-10-04 05:34:22.233 2 DEBUG nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkl5_lxfs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d4747ed-5583-49d0-bc11-6ea6be7e8a5f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Oct  4 01:34:22 np0005470441 nova_compute[192626]: 2025-10-04 05:34:22.265 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:34:22 np0005470441 nova_compute[192626]: 2025-10-04 05:34:22.265 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquired lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:34:22 np0005470441 nova_compute[192626]: 2025-10-04 05:34:22.266 2 DEBUG nova.network.neutron [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:34:22 np0005470441 podman[222422]: 2025-10-04 05:34:22.300347253 +0000 UTC m=+0.054667517 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6)
Oct  4 01:34:22 np0005470441 nova_compute[192626]: 2025-10-04 05:34:22.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.834 2 DEBUG nova.network.neutron [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Updating instance_info_cache with network_info: [{"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.871 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Releasing lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.872 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkl5_lxfs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d4747ed-5583-49d0-bc11-6ea6be7e8a5f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.873 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Creating instance directory: /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.873 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Creating disk.info with the contents: {'/var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk': 'qcow2', '/var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.873 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.874 2 DEBUG nova.objects.instance [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.900 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.953 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.954 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.955 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:24 np0005470441 nova_compute[192626]: 2025-10-04 05:34:24.970 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.055 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.056 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.094 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.095 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.096 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.168 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.169 2 DEBUG nova.virt.disk.api [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Checking if we can resize image /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.170 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.222 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.223 2 DEBUG nova.virt.disk.api [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Cannot resize image /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.224 2 DEBUG nova.objects.instance [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lazy-loading 'migration_context' on Instance uuid 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.237 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.260 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.262 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config to /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.262 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:25 np0005470441 podman[222458]: 2025-10-04 05:34:25.310227515 +0000 UTC m=+0.062527660 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.781 2 DEBUG oslo_concurrency.processutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f/disk.config /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.782 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.783 2 DEBUG nova.virt.libvirt.vif [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:34:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1570737566',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1570737566',id=14,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:34:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0a30d290b7ef45f3ade527507f03ce55',ramdisk_id='',reservation_id='r-z0dsngdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-869616',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-869616-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:34:15Z,user_data=None,user_id='174330e695c64fc1ac9d921e330c5642',uuid=3d4747ed-5583-49d0-bc11-6ea6be7e8a5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.784 2 DEBUG nova.network.os_vif_util [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Converting VIF {"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.785 2 DEBUG nova.network.os_vif_util [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.785 2 DEBUG os_vif [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.786 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.787 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.789 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1d83b25-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc1d83b25-e4, col_values=(('external_ids', {'iface-id': 'c1d83b25-e45b-4ecf-b8ba-f74235147b5a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:b2:19', 'vm-uuid': '3d4747ed-5583-49d0-bc11-6ea6be7e8a5f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:25 np0005470441 NetworkManager[51690]: <info>  [1759556065.7925] manager: (tapc1d83b25-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.799 2 INFO os_vif [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4')#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.799 2 DEBUG nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Oct  4 01:34:25 np0005470441 nova_compute[192626]: 2025-10-04 05:34:25.800 2 DEBUG nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkl5_lxfs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d4747ed-5583-49d0-bc11-6ea6be7e8a5f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Oct  4 01:34:27 np0005470441 nova_compute[192626]: 2025-10-04 05:34:27.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:27 np0005470441 nova_compute[192626]: 2025-10-04 05:34:27.768 2 DEBUG nova.network.neutron [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Port c1d83b25-e45b-4ecf-b8ba-f74235147b5a updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Oct  4 01:34:27 np0005470441 nova_compute[192626]: 2025-10-04 05:34:27.770 2 DEBUG nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpkl5_lxfs',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3d4747ed-5583-49d0-bc11-6ea6be7e8a5f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Oct  4 01:34:27 np0005470441 podman[222490]: 2025-10-04 05:34:27.792633235 +0000 UTC m=+0.046948637 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:34:28 np0005470441 kernel: tapc1d83b25-e4: entered promiscuous mode
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:28Z|00096|binding|INFO|Claiming lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a for this additional chassis.
Oct  4 01:34:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:28Z|00097|binding|INFO|c1d83b25-e45b-4ecf-b8ba-f74235147b5a: Claiming fa:16:3e:e1:b2:19 10.100.0.12
Oct  4 01:34:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:28Z|00098|binding|INFO|Claiming lport cc11434d-3b70-44c4-aeaa-3c368418a6c1 for this additional chassis.
Oct  4 01:34:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:28Z|00099|binding|INFO|cc11434d-3b70-44c4-aeaa-3c368418a6c1: Claiming fa:16:3e:75:61:97 19.80.0.147
Oct  4 01:34:28 np0005470441 NetworkManager[51690]: <info>  [1759556068.0652] manager: (tapc1d83b25-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Oct  4 01:34:28 np0005470441 systemd-udevd[222523]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:34:28 np0005470441 NetworkManager[51690]: <info>  [1759556068.1010] device (tapc1d83b25-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:34:28 np0005470441 NetworkManager[51690]: <info>  [1759556068.1019] device (tapc1d83b25-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:34:28 np0005470441 systemd-machined[152624]: New machine qemu-7-instance-0000000e.
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:28Z|00100|binding|INFO|Setting lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a ovn-installed in OVS
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:28 np0005470441 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.767 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.769 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.799 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.884 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.885 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.893 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:34:28 np0005470441 nova_compute[192626]: 2025-10-04 05:34:28.893 2 INFO nova.compute.claims [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.086 2 DEBUG nova.compute.provider_tree [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.100 2 DEBUG nova.scheduler.client.report [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.153 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.154 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.228 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.228 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.239 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556069.2393863, 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.240 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] VM Started (Lifecycle Event)#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.253 2 INFO nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.256 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.270 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.381 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.382 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.383 2 INFO nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Creating image(s)#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.383 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.384 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.385 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.398 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.484 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.485 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.486 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.496 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.514 2 DEBUG nova.policy [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.553 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.554 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.602 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.603 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.603 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.658 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.659 2 DEBUG nova.virt.disk.api [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Checking if we can resize image /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.659 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.716 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.717 2 DEBUG nova.virt.disk.api [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Cannot resize image /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.717 2 DEBUG nova.objects.instance [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 1701a941-088f-4d8d-99a0-3ab59e08de62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.742 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.743 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Ensure instance console log exists: /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.743 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.743 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:29 np0005470441 nova_compute[192626]: 2025-10-04 05:34:29.743 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.120 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556070.1201057, 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.120 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.176 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.180 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:34:30 np0005470441 podman[222570]: 2025-10-04 05:34:30.349465464 +0000 UTC m=+0.102560810 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.437 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Oct  4 01:34:30 np0005470441 nova_compute[192626]: 2025-10-04 05:34:30.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:31 np0005470441 nova_compute[192626]: 2025-10-04 05:34:31.698 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Successfully created port: e049c33e-0d6a-464d-99f6-ec92be78f298 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:34:32 np0005470441 nova_compute[192626]: 2025-10-04 05:34:32.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00101|binding|INFO|Claiming lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a for this chassis.
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00102|binding|INFO|c1d83b25-e45b-4ecf-b8ba-f74235147b5a: Claiming fa:16:3e:e1:b2:19 10.100.0.12
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00103|binding|INFO|Claiming lport cc11434d-3b70-44c4-aeaa-3c368418a6c1 for this chassis.
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00104|binding|INFO|cc11434d-3b70-44c4-aeaa-3c368418a6c1: Claiming fa:16:3e:75:61:97 19.80.0.147
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00105|binding|INFO|Setting lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a up in Southbound
Oct  4 01:34:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:32Z|00106|binding|INFO|Setting lport cc11434d-3b70-44c4-aeaa-3c368418a6c1 up in Southbound
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.004 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:b2:19 10.100.0.12'], port_security=['fa:16:3e:e1:b2:19 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-882931926', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3d4747ed-5583-49d0-bc11-6ea6be7e8a5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-882931926', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '11', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c1d83b25-e45b-4ecf-b8ba-f74235147b5a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.006 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:61:97 19.80.0.147'], port_security=['fa:16:3e:75:61:97 19.80.0.147'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['c1d83b25-e45b-4ecf-b8ba-f74235147b5a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1026755307', 'neutron:cidrs': '19.80.0.147/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b29c042d-5444-48d3-95fd-56933a7e65a1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1026755307', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '3', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c8b5faf8-cac7-402d-ac27-5352e16c3484, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc11434d-3b70-44c4-aeaa-3c368418a6c1) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.006 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c1d83b25-e45b-4ecf-b8ba-f74235147b5a in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 bound to our chassis#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.008 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a984030f-c569-4bd0-83e0-9a6812d06f48#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.027 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2ec9a8a5-4d35-4ed1-990c-1f6d18f5af07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.065 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8da1ae-ee8d-47d3-a184-723aba97fe15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.068 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7f68d266-e02a-47b0-9b86-243d1e2e09e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.101 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[288f4b6f-8c16-4990-bdae-8f053fe4ddb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.122 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d38ae0ff-930a-42d6-acf3-e7ea9830a220]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 24, 'tx_packets': 7, 'rx_bytes': 1504, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 24, 'tx_packets': 7, 'rx_bytes': 1504, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222601, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.141 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[766adde8-1371-4300-b6c7-643cae7cad30]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386682, 'tstamp': 386682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222602, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386685, 'tstamp': 386685}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222602, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.142 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.145 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa984030f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.145 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.146 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa984030f-c0, col_values=(('external_ids', {'iface-id': '3ea6f406-5ff7-4b46-9301-f23ee9be4b86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.146 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.147 103689 INFO neutron.agent.ovn.metadata.agent [-] Port cc11434d-3b70-44c4-aeaa-3c368418a6c1 in datapath b29c042d-5444-48d3-95fd-56933a7e65a1 unbound from our chassis#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.149 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b29c042d-5444-48d3-95fd-56933a7e65a1#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.162 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c7e9d6ea-fb28-4dc1-beb5-71a05baa2103]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.163 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb29c042d-51 in ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.165 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb29c042d-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.165 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[373eac87-e9ad-4fd3-b012-d41d234c8e47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.166 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f47d447-d76b-4d41-84d4-5fa774821787]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.177 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[7dd22f81-bc37-4c27-8204-b8120ece5f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.195 2 INFO nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Post operation of migration started#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.200 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ac5e0070-9bb7-47ba-9624-0c2c1d0700f5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.237 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[841654cc-efc6-40af-9277-dff58d0e0b6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.251 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d13dbd13-4d3d-49d8-bae0-92e07a0ba854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 NetworkManager[51690]: <info>  [1759556073.2547] manager: (tapb29c042d-50): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Oct  4 01:34:33 np0005470441 systemd-udevd[222610]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.290 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[3d93466f-d549-41a1-981e-6a518014a28d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.295 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[5013a29a-4a4f-442f-857b-aaae878d1c49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 NetworkManager[51690]: <info>  [1759556073.3238] device (tapb29c042d-50): carrier: link connected
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.328 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2e572c-694b-452a-a64d-23faa182d7eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.342 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[426ac147-7467-44c0-a209-a011edef7449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb29c042d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:2c:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397097, 'reachable_time': 37110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222629, 'error': None, 'target': 'ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.357 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[11e1dc9c-14cd-40c4-b92b-1ad08df801d0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:2c49'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397097, 'tstamp': 397097}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222630, 'error': None, 'target': 'ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.371 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2caddb-540c-4701-b186-69995e7582f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb29c042d-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:2c:49'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397097, 'reachable_time': 37110, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222631, 'error': None, 'target': 'ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.397 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[10f9370f-7159-4176-906d-472db8ad2005]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.448 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[43b5013f-d8aa-4373-99b0-b7dcd877f7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.449 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb29c042d-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.449 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.450 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb29c042d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 NetworkManager[51690]: <info>  [1759556073.5006] manager: (tapb29c042d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct  4 01:34:33 np0005470441 kernel: tapb29c042d-50: entered promiscuous mode
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.503 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb29c042d-50, col_values=(('external_ids', {'iface-id': 'd46606a2-6020-4260-95f5-96b5bff72200'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:33Z|00107|binding|INFO|Releasing lport d46606a2-6020-4260-95f5-96b5bff72200 from this chassis (sb_readonly=0)
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.517 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b29c042d-5444-48d3-95fd-56933a7e65a1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b29c042d-5444-48d3-95fd-56933a7e65a1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.518 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e98ad9c3-3335-463b-8e81-72f69266c9ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.519 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-b29c042d-5444-48d3-95fd-56933a7e65a1
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/b29c042d-5444-48d3-95fd-56933a7e65a1.pid.haproxy
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID b29c042d-5444-48d3-95fd-56933a7e65a1
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:34:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:33.519 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1', 'env', 'PROCESS_TAG=haproxy-b29c042d-5444-48d3-95fd-56933a7e65a1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b29c042d-5444-48d3-95fd-56933a7e65a1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.693 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.693 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquired lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.694 2 DEBUG nova.network.neutron [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.699 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Successfully updated port: e049c33e-0d6a-464d-99f6-ec92be78f298 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.808 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.809 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquired lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:34:33 np0005470441 nova_compute[192626]: 2025-10-04 05:34:33.809 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:34:33 np0005470441 podman[222664]: 2025-10-04 05:34:33.868899007 +0000 UTC m=+0.048278715 container create a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3)
Oct  4 01:34:33 np0005470441 systemd[1]: Started libpod-conmon-a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec.scope.
Oct  4 01:34:33 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:34:33 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c801eb83d91db244270e1a9e6dda082e6c81da0c9688fec309758c22e6a9ae7f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:34:33 np0005470441 podman[222664]: 2025-10-04 05:34:33.844708969 +0000 UTC m=+0.024088677 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:34:33 np0005470441 podman[222664]: 2025-10-04 05:34:33.944313813 +0000 UTC m=+0.123693541 container init a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  4 01:34:33 np0005470441 podman[222664]: 2025-10-04 05:34:33.949983345 +0000 UTC m=+0.129363053 container start a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:34:33 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [NOTICE]   (222683) : New worker (222685) forked
Oct  4 01:34:33 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [NOTICE]   (222683) : Loading success.
Oct  4 01:34:34 np0005470441 nova_compute[192626]: 2025-10-04 05:34:34.389 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.614 2 DEBUG nova.network.neutron [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updating instance_info_cache with network_info: [{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.647 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Releasing lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.647 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Instance network_info: |[{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.650 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Start _get_guest_xml network_info=[{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.656 2 WARNING nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.663 2 DEBUG nova.virt.libvirt.host [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.665 2 DEBUG nova.virt.libvirt.host [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.668 2 DEBUG nova.virt.libvirt.host [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.669 2 DEBUG nova.virt.libvirt.host [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.670 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.670 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.671 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.671 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.672 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.672 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.672 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.673 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.673 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.673 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.674 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.674 2 DEBUG nova.virt.hardware [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.678 2 DEBUG nova.virt.libvirt.vif [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:34:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=15,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDX2VyBFoMVwGa9zc0KZ9bhtXhBJhCrhuTKG5hmyU8IpbM0X+Ixaq5iKC9FhofbzRwnAp3tguUaWpO8P70VfYAFlVnPOkMqXbMwvDxh4gX9vBJzXTSsko73Fx1Ona2RWYQ==',key_name='tempest-TestSecurityGroupsBasicOps-413146072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-ixhib040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:34:29Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=1701a941-088f-4d8d-99a0-3ab59e08de62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.679 2 DEBUG nova.network.os_vif_util [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.680 2 DEBUG nova.network.os_vif_util [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.681 2 DEBUG nova.objects.instance [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1701a941-088f-4d8d-99a0-3ab59e08de62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.701 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <uuid>1701a941-088f-4d8d-99a0-3ab59e08de62</uuid>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <name>instance-0000000f</name>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620</nova:name>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:34:35</nova:creationTime>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:user uuid="560c2ee221db4d87b04080584e8f0a48">tempest-TestSecurityGroupsBasicOps-1075539829-project-member</nova:user>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:project uuid="2eaa5fc2c08b415c8c98103e044fc0a3">tempest-TestSecurityGroupsBasicOps-1075539829</nova:project>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        <nova:port uuid="e049c33e-0d6a-464d-99f6-ec92be78f298">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="serial">1701a941-088f-4d8d-99a0-3ab59e08de62</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="uuid">1701a941-088f-4d8d-99a0-3ab59e08de62</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.config"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:0c:3c:e1"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <target dev="tape049c33e-0d"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/console.log" append="off"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:34:35 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:34:35 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:34:35 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:34:35 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.702 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Preparing to wait for external event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.703 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.703 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.703 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.704 2 DEBUG nova.virt.libvirt.vif [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:34:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=15,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDX2VyBFoMVwGa9zc0KZ9bhtXhBJhCrhuTKG5hmyU8IpbM0X+Ixaq5iKC9FhofbzRwnAp3tguUaWpO8P70VfYAFlVnPOkMqXbMwvDxh4gX9vBJzXTSsko73Fx1Ona2RWYQ==',key_name='tempest-TestSecurityGroupsBasicOps-413146072',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-ixhib040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:34:29Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=1701a941-088f-4d8d-99a0-3ab59e08de62,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.704 2 DEBUG nova.network.os_vif_util [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.705 2 DEBUG nova.network.os_vif_util [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.705 2 DEBUG os_vif [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.706 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.706 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape049c33e-0d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape049c33e-0d, col_values=(('external_ids', {'iface-id': 'e049c33e-0d6a-464d-99f6-ec92be78f298', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:3c:e1', 'vm-uuid': '1701a941-088f-4d8d-99a0-3ab59e08de62'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:35 np0005470441 NetworkManager[51690]: <info>  [1759556075.7125] manager: (tape049c33e-0d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.718 2 INFO os_vif [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d')#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.780 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.781 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.781 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No VIF found with MAC fa:16:3e:0c:3c:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.781 2 INFO nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Using config drive#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.789 2 DEBUG nova.network.neutron [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Updating instance_info_cache with network_info: [{"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.811 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Releasing lock "refresh_cache-3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.832 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.833 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.833 2 DEBUG oslo_concurrency.lockutils [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:35 np0005470441 nova_compute[192626]: 2025-10-04 05:34:35.839 2 INFO nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Oct  4 01:34:35 np0005470441 virtqemud[192168]: Domain id=7 name='instance-0000000e' uuid=3d4747ed-5583-49d0-bc11-6ea6be7e8a5f is tainted: custom-monitor
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.410 2 INFO nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Creating config drive at /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.config#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.416 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpics_rqzw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.498 2 DEBUG nova.compute.manager [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.498 2 DEBUG nova.compute.manager [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing instance network info cache due to event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.499 2 DEBUG oslo_concurrency.lockutils [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.499 2 DEBUG oslo_concurrency.lockutils [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.499 2 DEBUG nova.network.neutron [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.540 2 DEBUG oslo_concurrency.processutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpics_rqzw" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:36 np0005470441 kernel: tape049c33e-0d: entered promiscuous mode
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.5935] manager: (tape049c33e-0d): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:36Z|00108|binding|INFO|Claiming lport e049c33e-0d6a-464d-99f6-ec92be78f298 for this chassis.
Oct  4 01:34:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:36Z|00109|binding|INFO|e049c33e-0d6a-464d-99f6-ec92be78f298: Claiming fa:16:3e:0c:3c:e1 10.100.0.13
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.610 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:3c:e1 10.100.0.13'], port_security=['fa:16:3e:0c:3c:e1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1701a941-088f-4d8d-99a0-3ab59e08de62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77799376-2a50-4044-a8c2-dc7e983782a3 e6007abc-b196-4efc-89a1-b67345655b56', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7246ef1-86de-4684-8576-21e21bc385cc, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e049c33e-0d6a-464d-99f6-ec92be78f298) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.613 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e049c33e-0d6a-464d-99f6-ec92be78f298 in datapath 5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 bound to our chassis#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.614 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.623 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4c56530e-dbde-4649-98b7-f03cd39be521]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.624 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5097ec2e-e1 in ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.625 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5097ec2e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.625 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9265bcdd-a061-4598-ba3c-166c93e0af18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 systemd-udevd[222714]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.626 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3878f4a1-13fb-4173-b59d-fa442e16bcc5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 systemd-machined[152624]: New machine qemu-8-instance-0000000f.
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.6362] device (tape049c33e-0d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.6372] device (tape049c33e-0d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.639 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d23374-722e-4084-8e36-a176b789842d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 systemd[1]: Started Virtual Machine qemu-8-instance-0000000f.
Oct  4 01:34:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:36Z|00110|binding|INFO|Setting lport e049c33e-0d6a-464d-99f6-ec92be78f298 ovn-installed in OVS
Oct  4 01:34:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:36Z|00111|binding|INFO|Setting lport e049c33e-0d6a-464d-99f6-ec92be78f298 up in Southbound
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.668 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d0a1274f-9e57-437e-94d2-78633a8ab4a8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.694 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[5b657ed0-990b-426c-9130-01cb17f041e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.698 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d587b88e-aadc-4c4d-bf2d-f66468271e9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 systemd-udevd[222718]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.6996] manager: (tap5097ec2e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/58)
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.731 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[2d50891e-dc70-4d20-9f07-157639510035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.734 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[309b7efd-a401-498d-9b9d-2f7bd4274cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.7611] device (tap5097ec2e-e0): carrier: link connected
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.766 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7f87c490-b499-44ba-b19a-c8ae8de26520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.781 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2d40426f-a54b-4492-b4e1-a4616f6a1068]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5097ec2e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:c4:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397441, 'reachable_time': 27396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222747, 'error': None, 'target': 'ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.794 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2def5417-9285-4ebb-8695-d811ed35d28c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9c:c491'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 397441, 'tstamp': 397441}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222748, 'error': None, 'target': 'ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.808 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[911bd221-7eeb-4af8-a099-01e57ca4ffb2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5097ec2e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9c:c4:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397441, 'reachable_time': 27396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222749, 'error': None, 'target': 'ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.834 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d989f6c1-ab3f-461b-a5c4-e0c09085a810]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.849 2 INFO nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.889 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4de7557e-dcd8-4f99-af83-25a3a1086fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.890 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5097ec2e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.891 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.891 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5097ec2e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:36 np0005470441 NetworkManager[51690]: <info>  [1759556076.8933] manager: (tap5097ec2e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Oct  4 01:34:36 np0005470441 kernel: tap5097ec2e-e0: entered promiscuous mode
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.899 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5097ec2e-e0, col_values=(('external_ids', {'iface-id': '79e8a929-6b9f-4573-882b-7bde024a4549'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:36Z|00112|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:34:36 np0005470441 nova_compute[192626]: 2025-10-04 05:34:36.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.915 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.916 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b0b79cfb-752c-46ab-8dca-d5fae76d1d6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.916 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0.pid.haproxy
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:34:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:36.917 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'env', 'PROCESS_TAG=haproxy-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:34:37 np0005470441 podman[222787]: 2025-10-04 05:34:37.321174678 +0000 UTC m=+0.073626116 container create f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:37 np0005470441 podman[222787]: 2025-10-04 05:34:37.272877224 +0000 UTC m=+0.025328642 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:34:37 np0005470441 systemd[1]: Started libpod-conmon-f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8.scope.
Oct  4 01:34:37 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:34:37 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca1e7dd48eecf0e837db8d75affe5b61062178b53d6e20962d134d763e61f2ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:34:37 np0005470441 podman[222787]: 2025-10-04 05:34:37.461328527 +0000 UTC m=+0.213779955 container init f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:34:37 np0005470441 podman[222787]: 2025-10-04 05:34:37.473073591 +0000 UTC m=+0.225525029 container start f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  4 01:34:37 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [NOTICE]   (222805) : New worker (222810) forked
Oct  4 01:34:37 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [NOTICE]   (222805) : Loading success.
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.688 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556077.688296, 1701a941-088f-4d8d-99a0-3ab59e08de62 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.689 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] VM Started (Lifecycle Event)#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.709 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.712 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556077.688399, 1701a941-088f-4d8d-99a0-3ab59e08de62 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.713 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.729 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.732 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.749 2 DEBUG nova.network.neutron [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updated VIF entry in instance network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.749 2 DEBUG nova.network.neutron [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updating instance_info_cache with network_info: [{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.752 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.769 2 DEBUG oslo_concurrency.lockutils [req-22814de1-af84-49e3-b7d1-a247d5a595d8 req-5753b22b-6b5b-4ba7-b306-deabdd6dc2d2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.858 2 INFO nova.virt.libvirt.driver [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.862 2 DEBUG nova.compute.manager [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:37 np0005470441 nova_compute[192626]: 2025-10-04 05:34:37.880 2 DEBUG nova.objects.instance [None req-9d14b021-7253-4201-9acc-d1524455743d 477e7504334a42ac8570f3eebda65dba 22267f61db3147bea62c6049fa6ee6ed - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.585 2 DEBUG nova.compute.manager [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.586 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.586 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.586 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.586 2 DEBUG nova.compute.manager [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Processing event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.587 2 DEBUG nova.compute.manager [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.587 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.587 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.587 2 DEBUG oslo_concurrency.lockutils [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.587 2 DEBUG nova.compute.manager [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] No waiting events found dispatching network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.588 2 WARNING nova.compute.manager [req-724452db-e787-4d98-ae31-568436704410 req-2d86cacb-223c-411f-a9ac-6baf78947c5e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received unexpected event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 for instance with vm_state building and task_state spawning.#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.588 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.592 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556078.5923555, 1701a941-088f-4d8d-99a0-3ab59e08de62 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.592 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.595 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.598 2 INFO nova.virt.libvirt.driver [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Instance spawned successfully.#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.598 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.615 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.622 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.625 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.625 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.626 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.626 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.627 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.627 2 DEBUG nova.virt.libvirt.driver [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.654 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.690 2 INFO nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Took 9.31 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.690 2 DEBUG nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.764 2 INFO nova.compute.manager [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Took 9.91 seconds to build instance.#033[00m
Oct  4 01:34:38 np0005470441 nova_compute[192626]: 2025-10-04 05:34:38.791 2 DEBUG oslo_concurrency.lockutils [None req-d134b9d6-7109-4ced-a615-2cb1ec43ad8e 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:40 np0005470441 podman[222819]: 2025-10-04 05:34:40.309567938 +0000 UTC m=+0.055662695 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  4 01:34:40 np0005470441 podman[222820]: 2025-10-04 05:34:40.318218715 +0000 UTC m=+0.054745309 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:34:40 np0005470441 nova_compute[192626]: 2025-10-04 05:34:40.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.037 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.037 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.038 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.038 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.038 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.039 2 INFO nova.compute.manager [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Terminating instance#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.040 2 DEBUG nova.compute.manager [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:34:41 np0005470441 kernel: tapc1d83b25-e4 (unregistering): left promiscuous mode
Oct  4 01:34:41 np0005470441 NetworkManager[51690]: <info>  [1759556081.0639] device (tapc1d83b25-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00113|binding|INFO|Releasing lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a from this chassis (sb_readonly=0)
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00114|binding|INFO|Setting lport c1d83b25-e45b-4ecf-b8ba-f74235147b5a down in Southbound
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00115|binding|INFO|Releasing lport cc11434d-3b70-44c4-aeaa-3c368418a6c1 from this chassis (sb_readonly=0)
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00116|binding|INFO|Setting lport cc11434d-3b70-44c4-aeaa-3c368418a6c1 down in Southbound
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00117|binding|INFO|Removing iface tapc1d83b25-e4 ovn-installed in OVS
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.082 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:b2:19 10.100.0.12'], port_security=['fa:16:3e:e1:b2:19 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-882931926', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3d4747ed-5583-49d0-bc11-6ea6be7e8a5f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-882931926', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '11', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c1d83b25-e45b-4ecf-b8ba-f74235147b5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00118|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00119|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:34:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:41Z|00120|binding|INFO|Releasing lport d46606a2-6020-4260-95f5-96b5bff72200 from this chassis (sb_readonly=0)
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.084 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:61:97 19.80.0.147'], port_security=['fa:16:3e:75:61:97 19.80.0.147'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c1d83b25-e45b-4ecf-b8ba-f74235147b5a'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1026755307', 'neutron:cidrs': '19.80.0.147/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b29c042d-5444-48d3-95fd-56933a7e65a1', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1026755307', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '5', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=c8b5faf8-cac7-402d-ac27-5352e16c3484, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cc11434d-3b70-44c4-aeaa-3c368418a6c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.085 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c1d83b25-e45b-4ecf-b8ba-f74235147b5a in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 unbound from our chassis#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.088 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a984030f-c569-4bd0-83e0-9a6812d06f48#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.103 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[02914d7d-a887-4bef-bfa4-7cd82e39990f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.132 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[bb7a1b44-d118-4a5a-8f75-72d322762f58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.135 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c4990479-81f0-4bd7-b443-0f25ad1d96bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Oct  4 01:34:41 np0005470441 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 2.024s CPU time.
Oct  4 01:34:41 np0005470441 systemd-machined[152624]: Machine qemu-7-instance-0000000e terminated.
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.162 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5bcd6c-c2ae-49b3-8675-73d1ab115910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.180 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[251ae41f-e81c-4ac8-a052-34740f5e744d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 40, 'tx_packets': 9, 'rx_bytes': 2176, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 40, 'tx_packets': 9, 'rx_bytes': 2176, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222875, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.193 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9464704c-7e7b-4e67-885c-3d94d5113f68]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386682, 'tstamp': 386682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222876, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386685, 'tstamp': 386685}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222876, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.195 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.202 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa984030f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.202 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.203 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa984030f-c0, col_values=(('external_ids', {'iface-id': '3ea6f406-5ff7-4b46-9301-f23ee9be4b86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.203 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.204 103689 INFO neutron.agent.ovn.metadata.agent [-] Port cc11434d-3b70-44c4-aeaa-3c368418a6c1 in datapath b29c042d-5444-48d3-95fd-56933a7e65a1 unbound from our chassis#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.206 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b29c042d-5444-48d3-95fd-56933a7e65a1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.207 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[747524d4-0a35-4658-b55d-12462964a199]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.207 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1 namespace which is not needed anymore#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.300 2 INFO nova.virt.libvirt.driver [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Instance destroyed successfully.#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.301 2 DEBUG nova.objects.instance [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lazy-loading 'resources' on Instance uuid 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.317 2 DEBUG nova.virt.libvirt.vif [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:34:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1570737566',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1570737566',id=14,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:34:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a30d290b7ef45f3ade527507f03ce55',ramdisk_id='',reservation_id='r-z0dsngdn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-869616',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-869616-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:34:37Z,user_data=None,user_id='174330e695c64fc1ac9d921e330c5642',uuid=3d4747ed-5583-49d0-bc11-6ea6be7e8a5f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.320 2 DEBUG nova.network.os_vif_util [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converting VIF {"id": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "address": "fa:16:3e:e1:b2:19", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d83b25-e4", "ovs_interfaceid": "c1d83b25-e45b-4ecf-b8ba-f74235147b5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.321 2 DEBUG nova.network.os_vif_util [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.321 2 DEBUG os_vif [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.324 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1d83b25-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.331 2 INFO os_vif [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:b2:19,bridge_name='br-int',has_traffic_filtering=True,id=c1d83b25-e45b-4ecf-b8ba-f74235147b5a,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc1d83b25-e4')#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.332 2 INFO nova.virt.libvirt.driver [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Deleting instance files /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f_del#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.333 2 INFO nova.virt.libvirt.driver [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Deletion of /var/lib/nova/instances/3d4747ed-5583-49d0-bc11-6ea6be7e8a5f_del complete#033[00m
Oct  4 01:34:41 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [NOTICE]   (222683) : haproxy version is 2.8.14-c23fe91
Oct  4 01:34:41 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [NOTICE]   (222683) : path to executable is /usr/sbin/haproxy
Oct  4 01:34:41 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [WARNING]  (222683) : Exiting Master process...
Oct  4 01:34:41 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [ALERT]    (222683) : Current worker (222685) exited with code 143 (Terminated)
Oct  4 01:34:41 np0005470441 neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1[222679]: [WARNING]  (222683) : All workers exited. Exiting... (0)
Oct  4 01:34:41 np0005470441 systemd[1]: libpod-a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec.scope: Deactivated successfully.
Oct  4 01:34:41 np0005470441 podman[222908]: 2025-10-04 05:34:41.363138004 +0000 UTC m=+0.062686266 container died a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.400 2 INFO nova.compute.manager [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.401 2 DEBUG oslo.service.loopingcall [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.403 2 DEBUG nova.compute.manager [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.403 2 DEBUG nova.network.neutron [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:34:41 np0005470441 systemd[1]: var-lib-containers-storage-overlay-c801eb83d91db244270e1a9e6dda082e6c81da0c9688fec309758c22e6a9ae7f-merged.mount: Deactivated successfully.
Oct  4 01:34:41 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec-userdata-shm.mount: Deactivated successfully.
Oct  4 01:34:41 np0005470441 podman[222908]: 2025-10-04 05:34:41.444385096 +0000 UTC m=+0.143933338 container cleanup a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:34:41 np0005470441 systemd[1]: libpod-conmon-a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec.scope: Deactivated successfully.
Oct  4 01:34:41 np0005470441 podman[222940]: 2025-10-04 05:34:41.496586622 +0000 UTC m=+0.034712469 container remove a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.502 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ce643d38-d0a3-4367-a347-427a42bebf9e]: (4, ('Sat Oct  4 05:34:41 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1 (a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec)\na888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec\nSat Oct  4 05:34:41 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1 (a888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec)\na888cf63fdafd480fa768284a6bd94883f89a8fc8d64aa17ea507f6b92926bec\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.503 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[047173e0-cd0e-4a14-91c3-1262b621ee7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.504 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb29c042d-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 kernel: tapb29c042d-50: left promiscuous mode
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.549 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[28fe68a9-887e-4a4c-b550-3235213d0df7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.574 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[93515792-eef1-40b1-8c94-d04f8aa895a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.575 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f40b8701-00ac-4c21-b523-7b0954f85dcc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.593 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[403ffe77-0eb4-49b3-9194-bdb8fdbabe55]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397088, 'reachable_time': 20568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222955, 'error': None, 'target': 'ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 systemd[1]: run-netns-ovnmeta\x2db29c042d\x2d5444\x2d48d3\x2d95fd\x2d56933a7e65a1.mount: Deactivated successfully.
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.597 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b29c042d-5444-48d3-95fd-56933a7e65a1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:34:41 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:41.597 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[0f503ff0-6035-499f-b321-f310c405095b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.893 2 DEBUG nova.compute.manager [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Received event network-vif-unplugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.893 2 DEBUG oslo_concurrency.lockutils [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.894 2 DEBUG oslo_concurrency.lockutils [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.894 2 DEBUG oslo_concurrency.lockutils [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.894 2 DEBUG nova.compute.manager [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] No waiting events found dispatching network-vif-unplugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:41 np0005470441 nova_compute[192626]: 2025-10-04 05:34:41.894 2 DEBUG nova.compute.manager [req-0626dc22-f58a-42c9-83c1-0222cfef840c req-ab688acd-7b81-4ff7-822e-89bbc18ee944 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Received event network-vif-unplugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:34:42 np0005470441 nova_compute[192626]: 2025-10-04 05:34:42.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:43 np0005470441 NetworkManager[51690]: <info>  [1759556083.4404] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct  4 01:34:43 np0005470441 NetworkManager[51690]: <info>  [1759556083.4413] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct  4 01:34:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:43Z|00121|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:34:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:43Z|00122|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:34:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:43Z|00123|binding|INFO|Releasing lport 3ea6f406-5ff7-4b46-9301-f23ee9be4b86 from this chassis (sb_readonly=0)
Oct  4 01:34:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:43Z|00124|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.992 2 DEBUG nova.compute.manager [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Received event network-vif-plugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.992 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 DEBUG nova.compute.manager [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] No waiting events found dispatching network-vif-plugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 WARNING nova.compute.manager [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Received unexpected event network-vif-plugged-c1d83b25-e45b-4ecf-b8ba-f74235147b5a for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 DEBUG nova.compute.manager [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.993 2 DEBUG nova.compute.manager [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing instance network info cache due to event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.994 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.994 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:34:43 np0005470441 nova_compute[192626]: 2025-10-04 05:34:43.994 2 DEBUG nova.network.neutron [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.038 2 DEBUG nova.network.neutron [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.057 2 INFO nova.compute.manager [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Took 2.65 seconds to deallocate network for instance.#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.113 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.113 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.120 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.170 2 INFO nova.scheduler.client.report [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Deleted allocations for instance 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f#033[00m
Oct  4 01:34:44 np0005470441 nova_compute[192626]: 2025-10-04 05:34:44.268 2 DEBUG oslo_concurrency.lockutils [None req-e3af5603-cd1d-4c09-abba-e6b33f75c2a6 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "3d4747ed-5583-49d0-bc11-6ea6be7e8a5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:45 np0005470441 podman[222957]: 2025-10-04 05:34:45.309182768 +0000 UTC m=+0.055629084 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:34:45 np0005470441 podman[222958]: 2025-10-04 05:34:45.316177987 +0000 UTC m=+0.059539695 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:34:46 np0005470441 nova_compute[192626]: 2025-10-04 05:34:46.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:47 np0005470441 nova_compute[192626]: 2025-10-04 05:34:47.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:48 np0005470441 nova_compute[192626]: 2025-10-04 05:34:48.788 2 DEBUG nova.network.neutron [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updated VIF entry in instance network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:34:48 np0005470441 nova_compute[192626]: 2025-10-04 05:34:48.789 2 DEBUG nova.network.neutron [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updating instance_info_cache with network_info: [{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:48 np0005470441 nova_compute[192626]: 2025-10-04 05:34:48.810 2 DEBUG oslo_concurrency.lockutils [req-0b8a1571-0fc7-4150-98a6-95d1ad5b89aa req-547eec90-635a-415f-bfb2-10438c3b4a24 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.473 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "286feccf-0ffd-498c-8db5-7128a3d0f965" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.474 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.474 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.474 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.474 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.476 2 INFO nova.compute.manager [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Terminating instance#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.477 2 DEBUG nova.compute.manager [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:34:50 np0005470441 kernel: tape82ee3ec-eb (unregistering): left promiscuous mode
Oct  4 01:34:50 np0005470441 NetworkManager[51690]: <info>  [1759556090.5047] device (tape82ee3ec-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:50Z|00125|binding|INFO|Releasing lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 from this chassis (sb_readonly=0)
Oct  4 01:34:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:50Z|00126|binding|INFO|Setting lport e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 down in Southbound
Oct  4 01:34:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:50Z|00127|binding|INFO|Removing iface tape82ee3ec-eb ovn-installed in OVS
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.519 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:8d:c7 10.100.0.3'], port_security=['fa:16:3e:4e:8d:c7 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '286feccf-0ffd-498c-8db5-7128a3d0f965', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '13', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.520 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 unbound from our chassis#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.522 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a984030f-c569-4bd0-83e0-9a6812d06f48#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.536 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c9212afc-df1b-4fd1-a2e6-d2f11ec87089]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  4 01:34:50 np0005470441 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000a.scope: Consumed 1.908s CPU time.
Oct  4 01:34:50 np0005470441 systemd-machined[152624]: Machine qemu-6-instance-0000000a terminated.
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.568 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[495f142f-a060-4020-90d7-0461259feae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.571 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[56267fae-89c3-4aec-b404-28260ae21c76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.603 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[20c8eb00-15c7-4efc-8713-1dee50da3dd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.621 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9cdc3e-c35b-467a-8aa4-39bf10d50188]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa984030f-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:44:91'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 40, 'tx_packets': 11, 'rx_bytes': 2176, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 40, 'tx_packets': 11, 'rx_bytes': 2176, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386672, 'reachable_time': 33947, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223020, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.636 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9245462b-2ea2-4903-800d-dd77dea54fa3]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386682, 'tstamp': 386682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223021, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapa984030f-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386685, 'tstamp': 386685}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223021, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.638 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.643 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa984030f-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.643 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.644 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa984030f-c0, col_values=(('external_ids', {'iface-id': '3ea6f406-5ff7-4b46-9301-f23ee9be4b86'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:50.644 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.699 2 DEBUG nova.compute.manager [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Received event network-vif-unplugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.699 2 DEBUG oslo_concurrency.lockutils [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.700 2 DEBUG oslo_concurrency.lockutils [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.700 2 DEBUG oslo_concurrency.lockutils [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.700 2 DEBUG nova.compute.manager [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] No waiting events found dispatching network-vif-unplugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.700 2 DEBUG nova.compute.manager [req-195899dd-73a6-4f84-8095-7eddb21e5681 req-5e734eea-6213-4a74-92b6-e63206fdde74 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Received event network-vif-unplugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.739 2 INFO nova.virt.libvirt.driver [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Instance destroyed successfully.#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.740 2 DEBUG nova.objects.instance [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lazy-loading 'resources' on Instance uuid 286feccf-0ffd-498c-8db5-7128a3d0f965 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.756 2 DEBUG nova.virt.libvirt.vif [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:33:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1864739972',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1864739972',id=10,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:33:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=3,progress=0,project_id='0a30d290b7ef45f3ade527507f03ce55',ramdisk_id='',reservation_id='r-rxx7vjpi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-869616',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-869616-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:33:50Z,user_data=None,user_id='174330e695c64fc1ac9d921e330c5642',uuid=286feccf-0ffd-498c-8db5-7128a3d0f965,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='paused') vif={"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.756 2 DEBUG nova.network.os_vif_util [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converting VIF {"id": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "address": "fa:16:3e:4e:8d:c7", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape82ee3ec-eb", "ovs_interfaceid": "e82ee3ec-eb7b-4866-97bb-a0e71ab7a510", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.757 2 DEBUG nova.network.os_vif_util [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.757 2 DEBUG os_vif [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.759 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape82ee3ec-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.765 2 INFO os_vif [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:8d:c7,bridge_name='br-int',has_traffic_filtering=True,id=e82ee3ec-eb7b-4866-97bb-a0e71ab7a510,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape82ee3ec-eb')#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.765 2 INFO nova.virt.libvirt.driver [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Deleting instance files /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965_del#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.766 2 INFO nova.virt.libvirt.driver [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Deletion of /var/lib/nova/instances/286feccf-0ffd-498c-8db5-7128a3d0f965_del complete#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.811 2 INFO nova.compute.manager [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.812 2 DEBUG oslo.service.loopingcall [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.812 2 DEBUG nova.compute.manager [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:34:50 np0005470441 nova_compute[192626]: 2025-10-04 05:34:50.812 2 DEBUG nova.network.neutron [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:34:51 np0005470441 nova_compute[192626]: 2025-10-04 05:34:51.847 2 DEBUG nova.network.neutron [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:51 np0005470441 nova_compute[192626]: 2025-10-04 05:34:51.861 2 INFO nova.compute.manager [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Took 1.05 seconds to deallocate network for instance.#033[00m
Oct  4 01:34:51 np0005470441 nova_compute[192626]: 2025-10-04 05:34:51.912 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:51 np0005470441 nova_compute[192626]: 2025-10-04 05:34:51.913 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:51 np0005470441 nova_compute[192626]: 2025-10-04 05:34:51.989 2 DEBUG nova.compute.provider_tree [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.004 2 DEBUG nova.scheduler.client.report [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.025 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.173 2 INFO nova.scheduler.client.report [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Deleted allocations for instance 286feccf-0ffd-498c-8db5-7128a3d0f965#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.226 2 DEBUG oslo_concurrency.lockutils [None req-3d3f332e-fda7-496e-ae83-d6f1b8218a4d 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:52Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0c:3c:e1 10.100.0.13
Oct  4 01:34:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:52Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:3c:e1 10.100.0.13
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.775 2 DEBUG nova.compute.manager [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Received event network-vif-plugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.775 2 DEBUG oslo_concurrency.lockutils [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.775 2 DEBUG oslo_concurrency.lockutils [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.776 2 DEBUG oslo_concurrency.lockutils [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "286feccf-0ffd-498c-8db5-7128a3d0f965-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.776 2 DEBUG nova.compute.manager [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] No waiting events found dispatching network-vif-plugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.776 2 WARNING nova.compute.manager [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Received unexpected event network-vif-plugged-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.776 2 DEBUG nova.compute.manager [req-2129160d-18ad-4c46-9377-72648932839d req-8e53df8d-796d-4e5b-9aa8-1066fcb66317 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Received event network-vif-deleted-e82ee3ec-eb7b-4866-97bb-a0e71ab7a510 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.847 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.848 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.848 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.848 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.848 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.849 2 INFO nova.compute.manager [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Terminating instance#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.850 2 DEBUG nova.compute.manager [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:34:52 np0005470441 kernel: tapb0eb2882-c3 (unregistering): left promiscuous mode
Oct  4 01:34:52 np0005470441 NetworkManager[51690]: <info>  [1759556092.8810] device (tapb0eb2882-c3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:52Z|00128|binding|INFO|Releasing lport b0eb2882-c375-490a-9308-11da20a838e8 from this chassis (sb_readonly=0)
Oct  4 01:34:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:52Z|00129|binding|INFO|Setting lport b0eb2882-c375-490a-9308-11da20a838e8 down in Southbound
Oct  4 01:34:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:52Z|00130|binding|INFO|Removing iface tapb0eb2882-c3 ovn-installed in OVS
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:52.900 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:d8:fc 10.100.0.7'], port_security=['fa:16:3e:b3:d8:fc 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'b89756b5-b481-4ad9-aaf8-afda62b5d1bc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a984030f-c569-4bd0-83e0-9a6812d06f48', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a30d290b7ef45f3ade527507f03ce55', 'neutron:revision_number': '11', 'neutron:security_group_ids': '94b6fae9-83b1-4167-ab83-cf5d2163195e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bfeb7d9d-6193-40b2-b586-fa0e6ac8f060, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=b0eb2882-c375-490a-9308-11da20a838e8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:34:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:52.901 103689 INFO neutron.agent.ovn.metadata.agent [-] Port b0eb2882-c375-490a-9308-11da20a838e8 in datapath a984030f-c569-4bd0-83e0-9a6812d06f48 unbound from our chassis#033[00m
Oct  4 01:34:52 np0005470441 nova_compute[192626]: 2025-10-04 05:34:52.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:52.902 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a984030f-c569-4bd0-83e0-9a6812d06f48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:34:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:52.903 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3aca42-82dd-4eba-9703-10dc28dc6186]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:52.905 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48 namespace which is not needed anymore#033[00m
Oct  4 01:34:52 np0005470441 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct  4 01:34:52 np0005470441 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000003.scope: Consumed 6.515s CPU time.
Oct  4 01:34:52 np0005470441 systemd-machined[152624]: Machine qemu-4-instance-00000003 terminated.
Oct  4 01:34:52 np0005470441 podman[223043]: 2025-10-04 05:34:52.963430729 +0000 UTC m=+0.056263012 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7)
Oct  4 01:34:53 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [NOTICE]   (221400) : haproxy version is 2.8.14-c23fe91
Oct  4 01:34:53 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [NOTICE]   (221400) : path to executable is /usr/sbin/haproxy
Oct  4 01:34:53 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [WARNING]  (221400) : Exiting Master process...
Oct  4 01:34:53 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [ALERT]    (221400) : Current worker (221402) exited with code 143 (Terminated)
Oct  4 01:34:53 np0005470441 neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48[221378]: [WARNING]  (221400) : All workers exited. Exiting... (0)
Oct  4 01:34:53 np0005470441 systemd[1]: libpod-4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4.scope: Deactivated successfully.
Oct  4 01:34:53 np0005470441 podman[223080]: 2025-10-04 05:34:53.025056773 +0000 UTC m=+0.042799999 container died 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:34:53 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4-userdata-shm.mount: Deactivated successfully.
Oct  4 01:34:53 np0005470441 systemd[1]: var-lib-containers-storage-overlay-95d891736d1b9070e752cbe59dd0bc48203c1aea23c924d741850adfa30c7d15-merged.mount: Deactivated successfully.
Oct  4 01:34:53 np0005470441 podman[223080]: 2025-10-04 05:34:53.064222567 +0000 UTC m=+0.081965793 container cleanup 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:53 np0005470441 systemd[1]: libpod-conmon-4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4.scope: Deactivated successfully.
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.104 2 INFO nova.virt.libvirt.driver [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Instance destroyed successfully.#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.104 2 DEBUG nova.objects.instance [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lazy-loading 'resources' on Instance uuid b89756b5-b481-4ad9-aaf8-afda62b5d1bc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.117 2 DEBUG nova.virt.libvirt.vif [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:31:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-155332389',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-155332389',id=3,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:32:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a30d290b7ef45f3ade527507f03ce55',ramdisk_id='',reservation_id='r-fsfv578k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-869616',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-869616-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:32:56Z,user_data=None,user_id='174330e695c64fc1ac9d921e330c5642',uuid=b89756b5-b481-4ad9-aaf8-afda62b5d1bc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b0eb2882-c375-490a-9308-11da20a838e8", "address": "fa:16:3e:b3:d8:fc", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0eb2882-c3", "ovs_interfaceid": "b0eb2882-c375-490a-9308-11da20a838e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.117 2 DEBUG nova.network.os_vif_util [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converting VIF {"id": "b0eb2882-c375-490a-9308-11da20a838e8", "address": "fa:16:3e:b3:d8:fc", "network": {"id": "a984030f-c569-4bd0-83e0-9a6812d06f48", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1229007013-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a30d290b7ef45f3ade527507f03ce55", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb0eb2882-c3", "ovs_interfaceid": "b0eb2882-c375-490a-9308-11da20a838e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.118 2 DEBUG nova.network.os_vif_util [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b3:d8:fc,bridge_name='br-int',has_traffic_filtering=True,id=b0eb2882-c375-490a-9308-11da20a838e8,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0eb2882-c3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.118 2 DEBUG os_vif [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:d8:fc,bridge_name='br-int',has_traffic_filtering=True,id=b0eb2882-c375-490a-9308-11da20a838e8,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0eb2882-c3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.120 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb0eb2882-c3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.125 2 INFO os_vif [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b3:d8:fc,bridge_name='br-int',has_traffic_filtering=True,id=b0eb2882-c375-490a-9308-11da20a838e8,network=Network(a984030f-c569-4bd0-83e0-9a6812d06f48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb0eb2882-c3')#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.126 2 INFO nova.virt.libvirt.driver [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Deleting instance files /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc_del#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.126 2 INFO nova.virt.libvirt.driver [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Deletion of /var/lib/nova/instances/b89756b5-b481-4ad9-aaf8-afda62b5d1bc_del complete#033[00m
Oct  4 01:34:53 np0005470441 podman[223115]: 2025-10-04 05:34:53.151972835 +0000 UTC m=+0.062184201 container remove 4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001)
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.157 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5f87f0-f0fa-4ca9-a206-4f6ebc7f7de1]: (4, ('Sat Oct  4 05:34:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48 (4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4)\n4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4\nSat Oct  4 05:34:53 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48 (4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4)\n4380a43d6536b6d00fda955a19167965ebcffeb02b8c87c47b20077db3e8cce4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.158 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[86af21fe-a969-4b63-8df2-256da98e6027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.159 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa984030f-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:34:53 np0005470441 kernel: tapa984030f-c0: left promiscuous mode
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.165 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a86c5076-84c7-4bc5-a81d-0cc2efb098c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.171 2 INFO nova.compute.manager [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.171 2 DEBUG oslo.service.loopingcall [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.172 2 DEBUG nova.compute.manager [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.172 2 DEBUG nova.network.neutron [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.194 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[48bfb581-cd48-4fb9-b672-97c407925049]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.196 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ef9f0606-8932-4435-b964-6e53cec6924b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.212 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0b64311c-8ab5-4693-b6a5-0bc7b00ef018]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386664, 'reachable_time': 17696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223141, 'error': None, 'target': 'ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 systemd[1]: run-netns-ovnmeta\x2da984030f\x2dc569\x2d4bd0\x2d83e0\x2d9a6812d06f48.mount: Deactivated successfully.
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.215 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a984030f-c569-4bd0-83e0-9a6812d06f48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:34:53 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:34:53.215 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[978c11a5-f36a-4238-98e7-1e7041d77b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:53 np0005470441 nova_compute[192626]: 2025-10-04 05:34:53.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.059 2 DEBUG nova.network.neutron [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.075 2 INFO nova.compute.manager [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Took 0.90 seconds to deallocate network for instance.#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.131 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.131 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.272 2 DEBUG nova.compute.provider_tree [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.296 2 DEBUG nova.scheduler.client.report [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.324 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.345 2 INFO nova.scheduler.client.report [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Deleted allocations for instance b89756b5-b481-4ad9-aaf8-afda62b5d1bc#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.398 2 DEBUG oslo_concurrency.lockutils [None req-0c423ad3-1973-4cf4-b11f-1c1e991a71bc 174330e695c64fc1ac9d921e330c5642 0a30d290b7ef45f3ade527507f03ce55 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.862 2 DEBUG nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Received event network-vif-unplugged-b0eb2882-c375-490a-9308-11da20a838e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.863 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.863 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.864 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.864 2 DEBUG nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] No waiting events found dispatching network-vif-unplugged-b0eb2882-c375-490a-9308-11da20a838e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.865 2 WARNING nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Received unexpected event network-vif-unplugged-b0eb2882-c375-490a-9308-11da20a838e8 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.865 2 DEBUG nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Received event network-vif-plugged-b0eb2882-c375-490a-9308-11da20a838e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.865 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.866 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.866 2 DEBUG oslo_concurrency.lockutils [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "b89756b5-b481-4ad9-aaf8-afda62b5d1bc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.866 2 DEBUG nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] No waiting events found dispatching network-vif-plugged-b0eb2882-c375-490a-9308-11da20a838e8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.867 2 WARNING nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Received unexpected event network-vif-plugged-b0eb2882-c375-490a-9308-11da20a838e8 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:34:54 np0005470441 nova_compute[192626]: 2025-10-04 05:34:54.867 2 DEBUG nova.compute.manager [req-7cc6ced5-d0d5-45c6-8c8f-12733f6a291f req-42b49e7a-a1ca-40cf-a8d3-d987501fb3c8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Received event network-vif-deleted-b0eb2882-c375-490a-9308-11da20a838e8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.764 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.765 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.783 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.783 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.784 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.784 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.848 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:55 np0005470441 podman[223143]: 2025-10-04 05:34:55.880557142 +0000 UTC m=+0.056268712 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.920 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.921 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:34:55 np0005470441 nova_compute[192626]: 2025-10-04 05:34:55.975 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.109 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.110 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5550MB free_disk=73.43881607055664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.110 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.110 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.185 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 1701a941-088f-4d8d-99a0-3ab59e08de62 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.186 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.186 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.234 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.261 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.287 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.288 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.295 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556081.2948923, 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.296 2 INFO nova.compute.manager [-] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:34:56 np0005470441 nova_compute[192626]: 2025-10-04 05:34:56.321 2 DEBUG nova.compute.manager [None req-21aaa415-7ecd-4cfa-bf06-c5e53fd96c45 - - - - - -] [instance: 3d4747ed-5583-49d0-bc11-6ea6be7e8a5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 01:34:57 np0005470441 nova_compute[192626]: 2025-10-04 05:34:57.770 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:58 np0005470441 podman[223174]: 2025-10-04 05:34:58.30667142 +0000 UTC m=+0.054347918 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.769 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.769 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.788 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.789 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:34:58 np0005470441 nova_compute[192626]: 2025-10-04 05:34:58.789 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:34:59 np0005470441 ovn_controller[94840]: 2025-10-04T05:34:59Z|00131|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:34:59 np0005470441 nova_compute[192626]: 2025-10-04 05:34:59.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:34:59 np0005470441 nova_compute[192626]: 2025-10-04 05:34:59.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:01 np0005470441 podman[223194]: 2025-10-04 05:35:01.344413173 +0000 UTC m=+0.094260664 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:35:02 np0005470441 nova_compute[192626]: 2025-10-04 05:35:02.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:03 np0005470441 nova_compute[192626]: 2025-10-04 05:35:03.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:03 np0005470441 nova_compute[192626]: 2025-10-04 05:35:03.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:03 np0005470441 nova_compute[192626]: 2025-10-04 05:35:03.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:04 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:04Z|00132|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:35:04 np0005470441 nova_compute[192626]: 2025-10-04 05:35:04.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:05 np0005470441 nova_compute[192626]: 2025-10-04 05:35:05.738 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556090.7369282, 286feccf-0ffd-498c-8db5-7128a3d0f965 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:05 np0005470441 nova_compute[192626]: 2025-10-04 05:35:05.738 2 INFO nova.compute.manager [-] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:35:05 np0005470441 nova_compute[192626]: 2025-10-04 05:35:05.762 2 DEBUG nova.compute.manager [None req-14897ef0-1829-4dd8-93cd-0a58ce46e234 - - - - - -] [instance: 286feccf-0ffd-498c-8db5-7128a3d0f965] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:06.738 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:06.739 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:06.739 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:07 np0005470441 nova_compute[192626]: 2025-10-04 05:35:07.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:08 np0005470441 nova_compute[192626]: 2025-10-04 05:35:08.102 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556093.1010582, b89756b5-b481-4ad9-aaf8-afda62b5d1bc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:08 np0005470441 nova_compute[192626]: 2025-10-04 05:35:08.103 2 INFO nova.compute.manager [-] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:35:08 np0005470441 nova_compute[192626]: 2025-10-04 05:35:08.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:08 np0005470441 nova_compute[192626]: 2025-10-04 05:35:08.128 2 DEBUG nova.compute.manager [None req-76522d20-4ac3-4b4d-88b2-6f467b09fdba - - - - - -] [instance: b89756b5-b481-4ad9-aaf8-afda62b5d1bc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:11 np0005470441 podman[223221]: 2025-10-04 05:35:11.296304036 +0000 UTC m=+0.051508857 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:35:11 np0005470441 podman[223222]: 2025-10-04 05:35:11.296328797 +0000 UTC m=+0.048825271 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:35:12 np0005470441 nova_compute[192626]: 2025-10-04 05:35:12.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:13 np0005470441 nova_compute[192626]: 2025-10-04 05:35:13.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:13 np0005470441 nova_compute[192626]: 2025-10-04 05:35:13.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:14.114 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:35:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:14.114 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.479 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.503 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Triggering sync for uuid 1701a941-088f-4d8d-99a0-3ab59e08de62 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.504 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.504 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:14 np0005470441 nova_compute[192626]: 2025-10-04 05:35:14.552 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.047s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:15.117 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:16 np0005470441 podman[223262]: 2025-10-04 05:35:16.298456307 +0000 UTC m=+0.051613570 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  4 01:35:16 np0005470441 podman[223263]: 2025-10-04 05:35:16.303636975 +0000 UTC m=+0.054250925 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:35:17 np0005470441 nova_compute[192626]: 2025-10-04 05:35:17.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:17Z|00133|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:35:17 np0005470441 nova_compute[192626]: 2025-10-04 05:35:17.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:18 np0005470441 nova_compute[192626]: 2025-10-04 05:35:18.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:21 np0005470441 nova_compute[192626]: 2025-10-04 05:35:21.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:22 np0005470441 nova_compute[192626]: 2025-10-04 05:35:22.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:23 np0005470441 nova_compute[192626]: 2025-10-04 05:35:23.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:23 np0005470441 podman[223300]: 2025-10-04 05:35:23.303413961 +0000 UTC m=+0.058610129 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Oct  4 01:35:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:25Z|00134|binding|INFO|Releasing lport 79e8a929-6b9f-4573-882b-7bde024a4549 from this chassis (sb_readonly=0)
Oct  4 01:35:25 np0005470441 nova_compute[192626]: 2025-10-04 05:35:25.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:26 np0005470441 podman[223321]: 2025-10-04 05:35:26.301201137 +0000 UTC m=+0.055708306 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:35:27 np0005470441 nova_compute[192626]: 2025-10-04 05:35:27.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:28 np0005470441 nova_compute[192626]: 2025-10-04 05:35:28.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:29 np0005470441 podman[223345]: 2025-10-04 05:35:29.304434939 +0000 UTC m=+0.052311049 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.405 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.406 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.449 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.545 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.546 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.553 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.553 2 INFO nova.compute.claims [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.690 2 DEBUG nova.compute.provider_tree [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.713 2 DEBUG nova.scheduler.client.report [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.746 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.747 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.801 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.802 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.831 2 INFO nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.853 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.980 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.982 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.982 2 INFO nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Creating image(s)#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.983 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.983 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.984 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:31 np0005470441 nova_compute[192626]: 2025-10-04 05:35:31.996 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.033 2 DEBUG nova.policy [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.050 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.050 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.051 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.062 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.114 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.115 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.233 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk 1073741824" returned: 0 in 0.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.234 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.235 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.322 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.323 2 DEBUG nova.virt.disk.api [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.324 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:32 np0005470441 podman[223373]: 2025-10-04 05:35:32.371163528 +0000 UTC m=+0.121171999 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.377 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.378 2 DEBUG nova.virt.disk.api [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.378 2 DEBUG nova.objects.instance [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.393 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.394 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Ensure instance console log exists: /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.394 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.395 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:32 np0005470441 nova_compute[192626]: 2025-10-04 05:35:32.395 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.364 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.365 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.365 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.365 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.365 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.367 2 INFO nova.compute.manager [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Terminating instance#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.368 2 DEBUG nova.compute.manager [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:35:33 np0005470441 kernel: tape049c33e-0d (unregistering): left promiscuous mode
Oct  4 01:35:33 np0005470441 NetworkManager[51690]: <info>  [1759556133.4058] device (tape049c33e-0d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:35:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:33Z|00135|binding|INFO|Releasing lport e049c33e-0d6a-464d-99f6-ec92be78f298 from this chassis (sb_readonly=0)
Oct  4 01:35:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:33Z|00136|binding|INFO|Setting lport e049c33e-0d6a-464d-99f6-ec92be78f298 down in Southbound
Oct  4 01:35:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:33Z|00137|binding|INFO|Removing iface tape049c33e-0d ovn-installed in OVS
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.452 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:3c:e1 10.100.0.13'], port_security=['fa:16:3e:0c:3c:e1 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1701a941-088f-4d8d-99a0-3ab59e08de62', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '77799376-2a50-4044-a8c2-dc7e983782a3 e6007abc-b196-4efc-89a1-b67345655b56', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7246ef1-86de-4684-8576-21e21bc385cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e049c33e-0d6a-464d-99f6-ec92be78f298) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.454 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e049c33e-0d6a-464d-99f6-ec92be78f298 in datapath 5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 unbound from our chassis#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.455 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.456 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[72b0842a-fa32-4265-9136-6680969e5937]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.456 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 namespace which is not needed anymore#033[00m
Oct  4 01:35:33 np0005470441 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct  4 01:35:33 np0005470441 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000f.scope: Consumed 15.038s CPU time.
Oct  4 01:35:33 np0005470441 systemd-machined[152624]: Machine qemu-8-instance-0000000f terminated.
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.544 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Successfully created port: 74297ebf-db77-4cdc-a627-f0123223bbd8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.625 2 INFO nova.virt.libvirt.driver [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Instance destroyed successfully.#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.626 2 DEBUG nova.objects.instance [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'resources' on Instance uuid 1701a941-088f-4d8d-99a0-3ab59e08de62 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [NOTICE]   (222805) : haproxy version is 2.8.14-c23fe91
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [NOTICE]   (222805) : path to executable is /usr/sbin/haproxy
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [WARNING]  (222805) : Exiting Master process...
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [WARNING]  (222805) : Exiting Master process...
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [ALERT]    (222805) : Current worker (222810) exited with code 143 (Terminated)
Oct  4 01:35:33 np0005470441 neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0[222801]: [WARNING]  (222805) : All workers exited. Exiting... (0)
Oct  4 01:35:33 np0005470441 systemd[1]: libpod-f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8.scope: Deactivated successfully.
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.644 2 DEBUG nova.virt.libvirt.vif [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:34:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-642336620',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=15,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDX2VyBFoMVwGa9zc0KZ9bhtXhBJhCrhuTKG5hmyU8IpbM0X+Ixaq5iKC9FhofbzRwnAp3tguUaWpO8P70VfYAFlVnPOkMqXbMwvDxh4gX9vBJzXTSsko73Fx1Ona2RWYQ==',key_name='tempest-TestSecurityGroupsBasicOps-413146072',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:34:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-ixhib040',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:34:38Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=1701a941-088f-4d8d-99a0-3ab59e08de62,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.644 2 DEBUG nova.network.os_vif_util [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.645 2 DEBUG nova.network.os_vif_util [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.645 2 DEBUG os_vif [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.647 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape049c33e-0d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 podman[223428]: 2025-10-04 05:35:33.65120643 +0000 UTC m=+0.114592983 container died f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.652 2 INFO os_vif [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0c:3c:e1,bridge_name='br-int',has_traffic_filtering=True,id=e049c33e-0d6a-464d-99f6-ec92be78f298,network=Network(5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape049c33e-0d')#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.652 2 INFO nova.virt.libvirt.driver [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Deleting instance files /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62_del#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.653 2 INFO nova.virt.libvirt.driver [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Deletion of /var/lib/nova/instances/1701a941-088f-4d8d-99a0-3ab59e08de62_del complete#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.730 2 INFO nova.compute.manager [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.730 2 DEBUG oslo.service.loopingcall [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.731 2 DEBUG nova.compute.manager [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.731 2 DEBUG nova.network.neutron [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:35:33 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8-userdata-shm.mount: Deactivated successfully.
Oct  4 01:35:33 np0005470441 systemd[1]: var-lib-containers-storage-overlay-ca1e7dd48eecf0e837db8d75affe5b61062178b53d6e20962d134d763e61f2ad-merged.mount: Deactivated successfully.
Oct  4 01:35:33 np0005470441 podman[223428]: 2025-10-04 05:35:33.853218679 +0000 UTC m=+0.316605242 container cleanup f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:35:33 np0005470441 podman[223474]: 2025-10-04 05:35:33.913546496 +0000 UTC m=+0.039563007 container remove f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:35:33 np0005470441 systemd[1]: libpod-conmon-f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8.scope: Deactivated successfully.
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.919 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[896aea54-4652-4578-b66b-3fede8b20ba7]: (4, ('Sat Oct  4 05:35:33 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 (f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8)\nf13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8\nSat Oct  4 05:35:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 (f13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8)\nf13dc3b644d2c28aa3f9a78abbac39c91ba7fc6c5233f8ed59928cfbad8083d8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.921 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c7791add-0f06-49a4-ad8b-f48ab8b0abdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.923 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5097ec2e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 kernel: tap5097ec2e-e0: left promiscuous mode
Oct  4 01:35:33 np0005470441 nova_compute[192626]: 2025-10-04 05:35:33.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.941 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1adc389e-c732-4605-bdea-ec5310d8e3fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.976 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b32a29cf-17f1-4f06-8f1c-b31cbbe0a354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.977 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[72b4976b-5204-48bd-ade0-a6218a0106e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.992 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[76d956e0-3d43-4027-a1c3-5971335de8e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 397434, 'reachable_time': 26078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223489, 'error': None, 'target': 'ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.995 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:35:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:33.995 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d7c62d-371c-496f-8ff5-1e918a9cbc1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:33 np0005470441 systemd[1]: run-netns-ovnmeta\x2d5097ec2e\x2def8d\x2d4aa1\x2dacd0\x2de3274eb4a8e0.mount: Deactivated successfully.
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.130 2 DEBUG nova.compute.manager [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.131 2 DEBUG nova.compute.manager [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing instance network info cache due to event network-changed-e049c33e-0d6a-464d-99f6-ec92be78f298. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.131 2 DEBUG oslo_concurrency.lockutils [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.131 2 DEBUG oslo_concurrency.lockutils [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.131 2 DEBUG nova.network.neutron [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Refreshing network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.591 2 DEBUG nova.network.neutron [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.610 2 INFO nova.compute.manager [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Took 0.88 seconds to deallocate network for instance.#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.674 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.675 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.745 2 DEBUG nova.compute.provider_tree [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.759 2 DEBUG nova.scheduler.client.report [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.784 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.810 2 INFO nova.scheduler.client.report [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Deleted allocations for instance 1701a941-088f-4d8d-99a0-3ab59e08de62#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.821 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Successfully updated port: 74297ebf-db77-4cdc-a627-f0123223bbd8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.835 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.835 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.835 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:35:34 np0005470441 nova_compute[192626]: 2025-10-04 05:35:34.898 2 DEBUG oslo_concurrency.lockutils [None req-194207c2-5de0-42cc-8839-0be532e97183 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.060 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.256 2 DEBUG nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-vif-unplugged-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.256 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.257 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.257 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.257 2 DEBUG nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] No waiting events found dispatching network-vif-unplugged-e049c33e-0d6a-464d-99f6-ec92be78f298 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.258 2 WARNING nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received unexpected event network-vif-unplugged-e049c33e-0d6a-464d-99f6-ec92be78f298 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.258 2 DEBUG nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.258 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.259 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.259 2 DEBUG oslo_concurrency.lockutils [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1701a941-088f-4d8d-99a0-3ab59e08de62-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.259 2 DEBUG nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] No waiting events found dispatching network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.260 2 WARNING nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received unexpected event network-vif-plugged-e049c33e-0d6a-464d-99f6-ec92be78f298 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.260 2 DEBUG nova.compute.manager [req-86e30cac-3879-4975-814b-a59b3bb701c9 req-8c4016f2-ff0b-4689-bf61-546207c3d0df 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Received event network-vif-deleted-e049c33e-0d6a-464d-99f6-ec92be78f298 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.854 2 DEBUG nova.network.neutron [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updated VIF entry in instance network info cache for port e049c33e-0d6a-464d-99f6-ec92be78f298. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.855 2 DEBUG nova.network.neutron [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Updating instance_info_cache with network_info: [{"id": "e049c33e-0d6a-464d-99f6-ec92be78f298", "address": "fa:16:3e:0c:3c:e1", "network": {"id": "5097ec2e-ef8d-4aa1-acd0-e3274eb4a8e0", "bridge": "br-int", "label": "tempest-network-smoke--778012324", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape049c33e-0d", "ovs_interfaceid": "e049c33e-0d6a-464d-99f6-ec92be78f298", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:35:35 np0005470441 nova_compute[192626]: 2025-10-04 05:35:35.952 2 DEBUG oslo_concurrency.lockutils [req-ee8c57ff-a5ee-42f4-834d-b369d7853ec4 req-e83a42c3-eeff-4fd8-b013-6f649c557bc9 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1701a941-088f-4d8d-99a0-3ab59e08de62" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.499 2 DEBUG nova.compute.manager [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.499 2 DEBUG nova.compute.manager [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing instance network info cache due to event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.500 2 DEBUG oslo_concurrency.lockutils [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.635 2 DEBUG nova.network.neutron [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.663 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.663 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Instance network_info: |[{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.664 2 DEBUG oslo_concurrency.lockutils [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.664 2 DEBUG nova.network.neutron [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.667 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Start _get_guest_xml network_info=[{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.671 2 WARNING nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.678 2 DEBUG nova.virt.libvirt.host [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.678 2 DEBUG nova.virt.libvirt.host [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.682 2 DEBUG nova.virt.libvirt.host [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.683 2 DEBUG nova.virt.libvirt.host [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.684 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.684 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.685 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.685 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.686 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.686 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.686 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.687 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.687 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.687 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.688 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.688 2 DEBUG nova.virt.hardware [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.693 2 DEBUG nova.virt.libvirt.vif [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-232515109',display_name='tempest-TestNetworkBasicOps-server-232515109',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-232515109',id=17,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKXl/0n0maRedyncEWSafSqD2WwGB8Vqr91nng+d3xRMq22adOHq/udLYS3DSNPjzManhSKOloWbM/2YRVxEItwlVirx26joddP2R+wp4239FNQU8Fm+4331tnJcqVsp0A==',key_name='tempest-TestNetworkBasicOps-1787337635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-otz6u9lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:35:31Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=7b1055c2-f0e7-4493-a4cb-2fafa1519d27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.693 2 DEBUG nova.network.os_vif_util [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.694 2 DEBUG nova.network.os_vif_util [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.696 2 DEBUG nova.objects.instance [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.712 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <uuid>7b1055c2-f0e7-4493-a4cb-2fafa1519d27</uuid>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <name>instance-00000011</name>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-232515109</nova:name>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:35:36</nova:creationTime>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        <nova:port uuid="74297ebf-db77-4cdc-a627-f0123223bbd8">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="serial">7b1055c2-f0e7-4493-a4cb-2fafa1519d27</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="uuid">7b1055c2-f0e7-4493-a4cb-2fafa1519d27</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.config"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:19:8b:92"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <target dev="tap74297ebf-db"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/console.log" append="off"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:35:36 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:35:36 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:35:36 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:35:36 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.713 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Preparing to wait for external event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.713 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.714 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.714 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.715 2 DEBUG nova.virt.libvirt.vif [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-232515109',display_name='tempest-TestNetworkBasicOps-server-232515109',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-232515109',id=17,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKXl/0n0maRedyncEWSafSqD2WwGB8Vqr91nng+d3xRMq22adOHq/udLYS3DSNPjzManhSKOloWbM/2YRVxEItwlVirx26joddP2R+wp4239FNQU8Fm+4331tnJcqVsp0A==',key_name='tempest-TestNetworkBasicOps-1787337635',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-otz6u9lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:35:31Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=7b1055c2-f0e7-4493-a4cb-2fafa1519d27,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.715 2 DEBUG nova.network.os_vif_util [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.716 2 DEBUG nova.network.os_vif_util [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.716 2 DEBUG os_vif [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.718 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.720 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74297ebf-db, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap74297ebf-db, col_values=(('external_ids', {'iface-id': '74297ebf-db77-4cdc-a627-f0123223bbd8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:19:8b:92', 'vm-uuid': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:36 np0005470441 NetworkManager[51690]: <info>  [1759556136.7242] manager: (tap74297ebf-db): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.730 2 INFO os_vif [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db')#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.783 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.784 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.784 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:19:8b:92, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:35:36 np0005470441 nova_compute[192626]: 2025-10-04 05:35:36.784 2 INFO nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Using config drive#033[00m
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.246 2 INFO nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Creating config drive at /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.config#033[00m
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.250 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpevy4leee execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.380 2 DEBUG oslo_concurrency.processutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpevy4leee" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 kernel: tap74297ebf-db: entered promiscuous mode
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.4493] manager: (tap74297ebf-db): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:37Z|00138|binding|INFO|Claiming lport 74297ebf-db77-4cdc-a627-f0123223bbd8 for this chassis.
Oct  4 01:35:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:37Z|00139|binding|INFO|74297ebf-db77-4cdc-a627-f0123223bbd8: Claiming fa:16:3e:19:8b:92 10.100.0.11
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.461 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:8b:92 10.100.0.11'], port_security=['fa:16:3e:19:8b:92 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a901e824-af59-4d0d-a85b-944b8499efe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cc616d4e-df4c-4d66-970c-f4a8e0cb5479', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2f4ce93-db77-4538-9009-9e50923ab602, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=74297ebf-db77-4cdc-a627-f0123223bbd8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.463 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 74297ebf-db77-4cdc-a627-f0123223bbd8 in datapath a901e824-af59-4d0d-a85b-944b8499efe5 bound to our chassis#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.465 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a901e824-af59-4d0d-a85b-944b8499efe5#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.475 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[125fca72-c556-493f-95fb-8f9ce450ba1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.476 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa901e824-a1 in ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.478 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa901e824-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.478 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0f610ef7-b997-44a0-91cd-150bbd3511d9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.479 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[da693cce-e049-4047-bd2a-97d16da91ac0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:37Z|00140|binding|INFO|Setting lport 74297ebf-db77-4cdc-a627-f0123223bbd8 ovn-installed in OVS
Oct  4 01:35:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:37Z|00141|binding|INFO|Setting lport 74297ebf-db77-4cdc-a627-f0123223bbd8 up in Southbound
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.494 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[d373ca8b-d05a-4486-8bf6-6a93de8f9087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 systemd-udevd[223512]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:35:37 np0005470441 systemd-machined[152624]: New machine qemu-9-instance-00000011.
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.5134] device (tap74297ebf-db): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.5150] device (tap74297ebf-db): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:35:37 np0005470441 systemd[1]: Started Virtual Machine qemu-9-instance-00000011.
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.521 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[97625e8a-b17c-46d6-980d-319ed4fe908a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.554 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1943c95d-c750-4ad0-95c3-3fe82823ca7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.558 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0752ee12-926e-481b-8aa1-baba2029af4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.5603] manager: (tapa901e824-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/64)
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.603 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[2c5cb533-2127-4961-99a7-f246f14c7546]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.606 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[951335bb-cb97-4ba2-8533-039f9b092b31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.6285] device (tapa901e824-a0): carrier: link connected
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.633 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[65214308-d855-4413-a2d7-2c90270a5427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.649 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[54d56cac-7a27-46b1-a71b-ae3d8da38380]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa901e824-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:a9:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403528, 'reachable_time': 41401, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223543, 'error': None, 'target': 'ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.666 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9359b1b4-9865-4e85-a867-57af1311661b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:a9e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403528, 'tstamp': 403528}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223544, 'error': None, 'target': 'ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.682 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7828e6a9-7121-4666-a771-61d8de253ce4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa901e824-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:32:a9:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403528, 'reachable_time': 41401, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223545, 'error': None, 'target': 'ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.714 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2b4d96d3-3e75-4e2b-8f23-f7aa07582097]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.770 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[62fb0580-4488-40b8-8750-1f8b081a200f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.771 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa901e824-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.772 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.772 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa901e824-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:37 np0005470441 kernel: tapa901e824-a0: entered promiscuous mode
Oct  4 01:35:37 np0005470441 NetworkManager[51690]: <info>  [1759556137.7754] manager: (tapa901e824-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.777 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa901e824-a0, col_values=(('external_ids', {'iface-id': '2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:35:37 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:37Z|00142|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.780 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a901e824-af59-4d0d-a85b-944b8499efe5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a901e824-af59-4d0d-a85b-944b8499efe5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.781 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4a78761c-b9ba-45b8-9010-17d06ebd6b2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.782 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-a901e824-af59-4d0d-a85b-944b8499efe5
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/a901e824-af59-4d0d-a85b-944b8499efe5.pid.haproxy
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID a901e824-af59-4d0d-a85b-944b8499efe5
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:35:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:35:37.783 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5', 'env', 'PROCESS_TAG=haproxy-a901e824-af59-4d0d-a85b-944b8499efe5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a901e824-af59-4d0d-a85b-944b8499efe5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:35:37 np0005470441 nova_compute[192626]: 2025-10-04 05:35:37.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:38 np0005470441 podman[223582]: 2025-10-04 05:35:38.135611337 +0000 UTC m=+0.024446587 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.234 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556138.234262, 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.235 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] VM Started (Lifecycle Event)#033[00m
Oct  4 01:35:38 np0005470441 podman[223582]: 2025-10-04 05:35:38.326747436 +0000 UTC m=+0.215582656 container create 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.347 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.353 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556138.2343395, 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.354 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.378 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.381 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:35:38 np0005470441 systemd[1]: Started libpod-conmon-1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03.scope.
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.405 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:35:38 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:35:38 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cae679344b6103ac8a8172dd865a9ee0315793c7008493ce27927397e8499e5f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:35:38 np0005470441 podman[223582]: 2025-10-04 05:35:38.469161939 +0000 UTC m=+0.357997249 container init 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001)
Oct  4 01:35:38 np0005470441 podman[223582]: 2025-10-04 05:35:38.475072688 +0000 UTC m=+0.363907908 container start 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:35:38 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [NOTICE]   (223602) : New worker (223604) forked
Oct  4 01:35:38 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [NOTICE]   (223602) : Loading success.
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.685 2 DEBUG nova.compute.manager [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.685 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.686 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.686 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.686 2 DEBUG nova.compute.manager [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Processing event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.686 2 DEBUG nova.compute.manager [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.687 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.687 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.687 2 DEBUG oslo_concurrency.lockutils [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.687 2 DEBUG nova.compute.manager [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] No waiting events found dispatching network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.688 2 WARNING nova.compute.manager [req-3611c407-8986-4738-b386-c8888409e1ef req-f884a98a-9067-4634-b67a-22dfcbdc8f07 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received unexpected event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 for instance with vm_state building and task_state spawning.#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.688 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.691 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556138.6915834, 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.692 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.693 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.698 2 INFO nova.virt.libvirt.driver [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Instance spawned successfully.#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.698 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.738 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.745 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.750 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.751 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.752 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.752 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.753 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.753 2 DEBUG nova.virt.libvirt.driver [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.775 2 DEBUG nova.network.neutron [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updated VIF entry in instance network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.776 2 DEBUG nova.network.neutron [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.782 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.823 2 DEBUG oslo_concurrency.lockutils [req-8b6342e0-5a52-4a7f-a2a9-e85908359376 req-d8fb73c9-9a79-46de-bf77-e3d2cf67d152 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.844 2 INFO nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Took 6.86 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.844 2 DEBUG nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.941 2 INFO nova.compute.manager [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Took 7.42 seconds to build instance.#033[00m
Oct  4 01:35:38 np0005470441 nova_compute[192626]: 2025-10-04 05:35:38.985 2 DEBUG oslo_concurrency.lockutils [None req-2e3a7985-babf-45f2-8209-124a06b305e1 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:39Z|00143|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:35:39 np0005470441 nova_compute[192626]: 2025-10-04 05:35:39.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:39Z|00144|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:35:39 np0005470441 nova_compute[192626]: 2025-10-04 05:35:39.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:41 np0005470441 nova_compute[192626]: 2025-10-04 05:35:41.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:41 np0005470441 nova_compute[192626]: 2025-10-04 05:35:41.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:41 np0005470441 NetworkManager[51690]: <info>  [1759556141.8987] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct  4 01:35:41 np0005470441 NetworkManager[51690]: <info>  [1759556141.8998] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct  4 01:35:41 np0005470441 nova_compute[192626]: 2025-10-04 05:35:41.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:41Z|00145|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:35:41 np0005470441 nova_compute[192626]: 2025-10-04 05:35:41.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:42 np0005470441 podman[223616]: 2025-10-04 05:35:42.3591733 +0000 UTC m=+0.093484602 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:35:42 np0005470441 podman[223615]: 2025-10-04 05:35:42.364627175 +0000 UTC m=+0.101461078 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.388 2 DEBUG nova.compute.manager [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.388 2 DEBUG nova.compute.manager [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing instance network info cache due to event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.388 2 DEBUG oslo_concurrency.lockutils [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.389 2 DEBUG oslo_concurrency.lockutils [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.389 2 DEBUG nova.network.neutron [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:42 np0005470441 nova_compute[192626]: 2025-10-04 05:35:42.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:45 np0005470441 nova_compute[192626]: 2025-10-04 05:35:45.069 2 DEBUG nova.network.neutron [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updated VIF entry in instance network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:35:45 np0005470441 nova_compute[192626]: 2025-10-04 05:35:45.070 2 DEBUG nova.network.neutron [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:35:45 np0005470441 nova_compute[192626]: 2025-10-04 05:35:45.093 2 DEBUG oslo_concurrency.lockutils [req-90412f1e-2dcf-412e-92c3-847ef3ffb949 req-a1a6de0b-15fc-4fec-923a-a2526d92b928 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:35:46 np0005470441 nova_compute[192626]: 2025-10-04 05:35:46.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:47 np0005470441 podman[223659]: 2025-10-04 05:35:47.305201904 +0000 UTC m=+0.058175087 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:35:47 np0005470441 podman[223658]: 2025-10-04 05:35:47.31349773 +0000 UTC m=+0.069527950 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:35:47 np0005470441 nova_compute[192626]: 2025-10-04 05:35:47.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:48 np0005470441 nova_compute[192626]: 2025-10-04 05:35:48.624 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556133.6234498, 1701a941-088f-4d8d-99a0-3ab59e08de62 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:35:48 np0005470441 nova_compute[192626]: 2025-10-04 05:35:48.625 2 INFO nova.compute.manager [-] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:35:48 np0005470441 nova_compute[192626]: 2025-10-04 05:35:48.697 2 DEBUG nova.compute.manager [None req-57457ca2-77e3-43bc-b036-798119c61a5a - - - - - -] [instance: 1701a941-088f-4d8d-99a0-3ab59e08de62] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:35:49 np0005470441 nova_compute[192626]: 2025-10-04 05:35:49.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:51 np0005470441 nova_compute[192626]: 2025-10-04 05:35:51.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:52Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:19:8b:92 10.100.0.11
Oct  4 01:35:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:35:52Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:19:8b:92 10.100.0.11
Oct  4 01:35:52 np0005470441 nova_compute[192626]: 2025-10-04 05:35:52.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:54 np0005470441 podman[223713]: 2025-10-04 05:35:54.323373203 +0000 UTC m=+0.067965175 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct  4 01:35:55 np0005470441 nova_compute[192626]: 2025-10-04 05:35:55.741 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:56 np0005470441 nova_compute[192626]: 2025-10-04 05:35:56.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:57 np0005470441 podman[223734]: 2025-10-04 05:35:57.306825622 +0000 UTC m=+0.056816828 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.865 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.866 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.866 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:57 np0005470441 nova_compute[192626]: 2025-10-04 05:35:57.866 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.070 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.128 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.129 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.183 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.334 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.336 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5547MB free_disk=73.43671035766602GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.336 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.336 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.426 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.427 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.427 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.476 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.499 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.525 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:35:58 np0005470441 nova_compute[192626]: 2025-10-04 05:35:58.526 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:35:59 np0005470441 nova_compute[192626]: 2025-10-04 05:35:59.026 2 INFO nova.compute.manager [None req-fcbbca08-c892-47f2-876f-48d9e3c5c855 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Get console output#033[00m
Oct  4 01:35:59 np0005470441 nova_compute[192626]: 2025-10-04 05:35:59.031 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:35:59 np0005470441 nova_compute[192626]: 2025-10-04 05:35:59.527 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:35:59 np0005470441 nova_compute[192626]: 2025-10-04 05:35:59.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:00 np0005470441 podman[223765]: 2025-10-04 05:36:00.309173019 +0000 UTC m=+0.057874118 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  4 01:36:00 np0005470441 nova_compute[192626]: 2025-10-04 05:36:00.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:00 np0005470441 nova_compute[192626]: 2025-10-04 05:36:00.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:36:00 np0005470441 nova_compute[192626]: 2025-10-04 05:36:00.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:36:01 np0005470441 nova_compute[192626]: 2025-10-04 05:36:01.127 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:36:01 np0005470441 nova_compute[192626]: 2025-10-04 05:36:01.128 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:36:01 np0005470441 nova_compute[192626]: 2025-10-04 05:36:01.128 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:36:01 np0005470441 nova_compute[192626]: 2025-10-04 05:36:01.128 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:36:01 np0005470441 nova_compute[192626]: 2025-10-04 05:36:01.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:02 np0005470441 nova_compute[192626]: 2025-10-04 05:36:02.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.710 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'name': 'tempest-TestNetworkBasicOps-server-232515109', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000011', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ec39d6d697445438e79b0bfc666a027', 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'hostId': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.733 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.requests volume: 1060 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.733 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8c3c8aa-8999-48c5-b67d-8f705d290cdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1060, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.712241', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ecf962-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '652d0da5f2f48bf24de4dca7097e97869b3e60deb34f38ac58df6680630746f7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.712241', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ed07d6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': 'b2eced1d3358aa00d2f5b7d1bbe546598bef05f19113ab7f02cf2b26d2885a4a'}]}, 'timestamp': '2025-10-04 05:36:02.733988', '_unique_id': 'bd2857c8b7bb4084856c5dfd21401fef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.735 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.736 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.738 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 / tap74297ebf-db inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.738 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.outgoing.packets volume: 107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45be1b3c-a3dc-44f6-89a5-bcbba96f3440', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 107, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.736246', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03edc9fa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': 'dae32ca6bef6ded93e31b98bf5c656ecbbf34a63c47750601339ecd938937762'}]}, 'timestamp': '2025-10-04 05:36:02.738962', '_unique_id': '34b2894a5d33406ea31574c1e265df60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.739 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.740 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.bytes volume: 29260288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.740 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '902aa935-78ba-44b2-9faa-797a17362fe7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29260288, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.740483', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03ee116c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '5a4af0892b635d60182b3f0792e1b8893c21efc82fe8c68aad81e69a180fcfa6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.740483', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03ee1b12-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': 'b81e0fd902c1c716243e79382abb78fb8e236f18b679ae9d804b06ceea87e5e8'}]}, 'timestamp': '2025-10-04 05:36:02.741017', '_unique_id': '81cb4c4385f24aaf8808dfb78cac505d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.741 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.742 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.742 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b2d0670-6c0f-4b75-b62a-8c70cf06d560', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.742481', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03ee5f3c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': '4869d4f8df7f5b9532736fd2341c0f94a52c7fa6ca2a5e7d3cc1ace36c2a3461'}]}, 'timestamp': '2025-10-04 05:36:02.742789', '_unique_id': '4f66773f8f804eab984dad4c2cc0096d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.743 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad9c4b4c-8152-456b-9f3c-cea548eaa373', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.744225', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03eea262-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': '6ff3e3a8e931f9d2447e998cca32ad39da5c74fd009ffa2ddaa4476d35f2be56'}]}, 'timestamp': '2025-10-04 05:36:02.744494', '_unique_id': '08103599260e4e8baee687879de35fd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.744 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.745 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.outgoing.bytes volume: 15898 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bffb80c2-f7df-48ee-aedb-a865c4cef6ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15898, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.745926', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03eee4de-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': '14cb4a7ab6a6797368b5a7e022440989a6fa45d54775f9099afaed02931a2303'}]}, 'timestamp': '2025-10-04 05:36:02.746196', '_unique_id': 'c45f7cf5dd544afc890c6a9bacca3c3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.746 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.747 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.747 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.747 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>]
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.747 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68bbdb2c-a214-42ac-b9d8-75b98fd8f224', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.748088', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03ef3934-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': 'd2c3db7d8cd43b9a8fc77729008fc08c88959ba698907c6dfa51cd970d97676f'}]}, 'timestamp': '2025-10-04 05:36:02.748355', '_unique_id': '8d557d650d7e4864ac9c1c0164aad6a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.748 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.749 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.766 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/memory.usage volume: 42.8359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e41814aa-f9b5-4136-84ff-0a427f3f94e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.8359375, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'timestamp': '2025-10-04T05:36:02.749822', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '03f2296e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.481419418, 'message_signature': '8a1cab75429536536438f25639cb99c0e0ec1fffb9ddded742516c7274216b41'}]}, 'timestamp': '2025-10-04 05:36:02.767743', '_unique_id': 'bc38a3611f4e41ef96b84b553460697e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.768 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.770 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.770 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/cpu volume: 12070000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28d0e2f8-adc8-4cfa-bf81-11cdc85c5282', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12070000000, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'timestamp': '2025-10-04T05:36:02.770262', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '03f29d72-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.481419418, 'message_signature': '86445a93b53cd24a361b7fa184c6ef4357c0f79512efa9498ff0222d15fdc8ce'}]}, 'timestamp': '2025-10-04 05:36:02.770644', '_unique_id': '6a1c2ddd6b254a96b762023f36adf396'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.771 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.772 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.772 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.bytes volume: 72908800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.772 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5735ef6-e732-45d5-a3a3-8c3dccfc6143', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72908800, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.772577', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f2f7d6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '0bc8bc9c9b7b6adf3de31c2577064b7bd155a35b3d71308f83d5d070fc0da974'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.772577', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f304d8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '196a3953aa23cd3d1ec704e5d3c0ea140d55e44e61610658a0682c67fe752789'}]}, 'timestamp': '2025-10-04 05:36:02.773264', '_unique_id': 'fa760ad8716b4b8c944840fe14e23dce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.773 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.775 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.775 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>]
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.775 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.775 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9711c17-e945-45df-bc1f-2bf0d4e1d2d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.775779', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03f37512-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': '77c55f6daec713a53adc35ef1a4ec25ea7210ac7fe27abd86c725e474a5fd595'}]}, 'timestamp': '2025-10-04 05:36:02.776150', '_unique_id': '402d3c216882471084ce74ebb1cad643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.776 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.777 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.778 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.latency volume: 2151124442 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.778 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fc41d65-4f3b-43cb-a4f8-48e5b8c238b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2151124442, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.778139', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f3d426-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '77562e17c6474a77dd583c937d0274e657649a86c51a64633f12734fdb68467c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.778139', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f3e4ca-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '240cafa87243d8855d5434e81c474c718f020dfe344791eaef48f1d3d32e4c6e'}]}, 'timestamp': '2025-10-04 05:36:02.779000', '_unique_id': 'a4234edabf1f40b4a992d2acc1c94412'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.779 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.780 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.780 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.780 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>]
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.781 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.793 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.794 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30a47ea-824c-40a7-a556-348590141dcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.781065', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f63c2a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': '6d382c777bc033d4e259b7481bedc6e8e583491c821fef6d337c2bd9b495de13'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.781065', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f64ada-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': 'a694e21677d67bdc4ab7566ea0163296fa51f639fc895fcdfbcc40275c4a5ddc'}]}, 'timestamp': '2025-10-04 05:36:02.794718', '_unique_id': '9f0599dfaead4f8db17aadf34a0fad54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.796 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.incoming.packets volume: 107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '948f8e2d-2aa2-432e-a33d-bc09ce4f7503', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 107, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.796817', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03f6a8f4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': 'c1456de88d71f4ee2009d836a6c23805783cb426356da98bf66aa746011d3226'}]}, 'timestamp': '2025-10-04 05:36:02.797127', '_unique_id': '5d0c8c59840b48258b1874ec68feb409'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.797 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.800 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.800 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c8b4558-a611-4671-91a6-3d03d72e7e65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.800672', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f73e9a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': 'a27cc6dd208a17d2bd7c19ffcd3c108b923881cbd4f2ab6cc87c9a79c502d0af'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.800672', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f74714-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': '9195f023d60936aabec69f29223ae7b5f43c00176ef4462ceb5b25ab904dcad3'}]}, 'timestamp': '2025-10-04 05:36:02.801123', '_unique_id': 'ad1d7d95509e448c9c53283a7b51325d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.802 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.802 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.incoming.bytes volume: 19554 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfd8f513-68d1-4078-b850-3c66c31c915e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19554, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.802605', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03f78cba-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': '9bf1fd3abd1ce700dd250eda361fbdbed8737697c7ead7803c4e251d285222fd'}]}, 'timestamp': '2025-10-04 05:36:02.802924', '_unique_id': '8c3ad72cf4bc4d6f948d1d465aad9f95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.803 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.804 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.requests volume: 312 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.804 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6828563d-dd44-41d6-aca9-b167cf6367af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 312, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.804245', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f7c9e6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '7f6097e7a64fefb970b4cadb3ba442badc5f3f3b780642bcc4ccceab3a1219e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.804245', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f7d652-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': 'cbb7ab80674dda6ab5922fae74f23c7fe8a61730e05eccffd4e0e754b58bcd6c'}]}, 'timestamp': '2025-10-04 05:36:02.804847', '_unique_id': 'c8f0054da49342fd988cd50064fb924c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.805 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.806 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.806 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.806 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-232515109>]
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.806 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.806 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d954174-f4ed-40ea-b5d7-3fccd019ff5a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.806909', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03f831d8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': 'b8bd7a4e99a6541dd35d89ae127dc104fd14e580db2a78ab4e649c819bd66128'}]}, 'timestamp': '2025-10-04 05:36:02.807149', '_unique_id': '5bb76a0b6b7c46efa515a3ee655974f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.808 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.latency volume: 1458201730 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.808 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.read.latency volume: 60893982 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f68202fa-6713-4730-ae4c-0f2920374cc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1458201730, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.808356', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f86bf8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '13f89c819c7cb02f75590c4495fdd05ce684bef879afb77267362d26265c9dd9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60893982, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.808356', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f876e8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.426815804, 'message_signature': '2a0517ad16586a8f6321454572410774138c15d4ef354223026d005173f39ce8'}]}, 'timestamp': '2025-10-04 05:36:02.808937', '_unique_id': '2490e44f361b44abae256a76caa3eac8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.810 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.810 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b3e4274-8f84-425d-b9c7-c6ffb7656286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-vda', 'timestamp': '2025-10-04T05:36:02.810416', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '03f8bcca-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': '133b81021007b7ee66728784e141edd30f426ba14b30f0b19d45c4c68eb15a65'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27-sda', 'timestamp': '2025-10-04T05:36:02.810416', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'instance-00000011', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '03f8c878-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.495630222, 'message_signature': '3579fd25528cfab079db4d21b333d23586820ab00d559bf9f6cf458f81c51008'}]}, 'timestamp': '2025-10-04 05:36:02.811025', '_unique_id': 'ed421d611ef24a03aef60d1cd2e19924'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.811 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.812 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.812 12 DEBUG ceilometer.compute.pollsters [-] 7b1055c2-f0e7-4493-a4cb-2fafa1519d27/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20cfa37c-9763-4817-9251-dfb3404d5677', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000011-7b1055c2-f0e7-4493-a4cb-2fafa1519d27-tap74297ebf-db', 'timestamp': '2025-10-04T05:36:02.812625', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-232515109', 'name': 'tap74297ebf-db', 'instance_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:19:8b:92', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap74297ebf-db'}, 'message_id': '03f91256-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4060.450805956, 'message_signature': 'e431a9967acdd43ee66f31737e3d3cfc29cce7ac5cdbef8c546d327896a91673'}]}, 'timestamp': '2025-10-04 05:36:02.812940', '_unique_id': '327736fa976540bd9f924e6736ebab83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:36:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:36:02.813 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:36:03 np0005470441 podman[223784]: 2025-10-04 05:36:03.360363487 +0000 UTC m=+0.100214093 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.253 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.268 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.269 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.270 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.270 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.271 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:36:04 np0005470441 nova_compute[192626]: 2025-10-04 05:36:04.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:06 np0005470441 nova_compute[192626]: 2025-10-04 05:36:06.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:06.742 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:06.743 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:06.743 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:07 np0005470441 nova_compute[192626]: 2025-10-04 05:36:07.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:11 np0005470441 nova_compute[192626]: 2025-10-04 05:36:11.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:12 np0005470441 nova_compute[192626]: 2025-10-04 05:36:12.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:13 np0005470441 podman[223812]: 2025-10-04 05:36:13.304609711 +0000 UTC m=+0.056747786 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:36:13 np0005470441 podman[223811]: 2025-10-04 05:36:13.322386247 +0000 UTC m=+0.079265427 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:36:16 np0005470441 nova_compute[192626]: 2025-10-04 05:36:16.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:16.784 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:36:16 np0005470441 nova_compute[192626]: 2025-10-04 05:36:16.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:16.785 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:36:17 np0005470441 nova_compute[192626]: 2025-10-04 05:36:17.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:18 np0005470441 podman[223856]: 2025-10-04 05:36:18.302276235 +0000 UTC m=+0.059199726 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:36:18 np0005470441 podman[223857]: 2025-10-04 05:36:18.309028817 +0000 UTC m=+0.061721468 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.835 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.835 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.855 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.935 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.936 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.944 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:36:18 np0005470441 nova_compute[192626]: 2025-10-04 05:36:18.944 2 INFO nova.compute.claims [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.062 2 DEBUG nova.compute.provider_tree [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.078 2 DEBUG nova.scheduler.client.report [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.107 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.108 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.175 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.175 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.202 2 INFO nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.226 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.357 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.359 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.359 2 INFO nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Creating image(s)#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.360 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.360 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.361 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.375 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.432 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.434 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.434 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.452 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.506 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.507 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.549 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.550 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.550 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.578 2 DEBUG nova.policy [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.641 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.642 2 DEBUG nova.virt.disk.api [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.643 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.731 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.732 2 DEBUG nova.virt.disk.api [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.733 2 DEBUG nova.objects.instance [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.752 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.752 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Ensure instance console log exists: /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.753 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.753 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:19 np0005470441 nova_compute[192626]: 2025-10-04 05:36:19.753 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:19 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:19.787 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:20 np0005470441 nova_compute[192626]: 2025-10-04 05:36:20.521 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Successfully created port: 47a5f4a8-ec88-4a5f-a9d4-255266429b71 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.595 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Successfully updated port: 47a5f4a8-ec88-4a5f-a9d4-255266429b71 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.616 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.616 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.616 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.672 2 DEBUG nova.compute.manager [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-changed-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.673 2 DEBUG nova.compute.manager [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Refreshing instance network info cache due to event network-changed-47a5f4a8-ec88-4a5f-a9d4-255266429b71. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.673 2 DEBUG oslo_concurrency.lockutils [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:21 np0005470441 nova_compute[192626]: 2025-10-04 05:36:21.794 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.819 2 DEBUG nova.network.neutron [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Updating instance_info_cache with network_info: [{"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.971 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.972 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Instance network_info: |[{"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.972 2 DEBUG oslo_concurrency.lockutils [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.973 2 DEBUG nova.network.neutron [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Refreshing network info cache for port 47a5f4a8-ec88-4a5f-a9d4-255266429b71 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.975 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Start _get_guest_xml network_info=[{"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.979 2 WARNING nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.983 2 DEBUG nova.virt.libvirt.host [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.983 2 DEBUG nova.virt.libvirt.host [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.986 2 DEBUG nova.virt.libvirt.host [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.987 2 DEBUG nova.virt.libvirt.host [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.989 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.989 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.990 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.991 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.991 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.992 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.992 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.993 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.993 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.994 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.994 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:36:22 np0005470441 nova_compute[192626]: 2025-10-04 05:36:22.995 2 DEBUG nova.virt.hardware [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.001 2 DEBUG nova.virt.libvirt.vif [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1154513081',display_name='tempest-TestNetworkBasicOps-server-1154513081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1154513081',id=19,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBc2Q6aGE0NnI427OvTHfwEOWmMmCWerSDayUXMFRIWN8JjOMhSO/+Q4kwOq6rha2Svgy4r4GY+jvloBkShFseEa5NnoN77AhyA5J6ElXsKlt1n1a+RdvZdfuk3rDx0ZvA==',key_name='tempest-TestNetworkBasicOps-1012604518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dqwaax58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:36:19Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=3dbf4d02-61da-4a02-a7d6-a8de4aafcd67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.002 2 DEBUG nova.network.os_vif_util [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.003 2 DEBUG nova.network.os_vif_util [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.005 2 DEBUG nova.objects.instance [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.089 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <uuid>3dbf4d02-61da-4a02-a7d6-a8de4aafcd67</uuid>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <name>instance-00000013</name>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-1154513081</nova:name>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:36:22</nova:creationTime>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        <nova:port uuid="47a5f4a8-ec88-4a5f-a9d4-255266429b71">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="serial">3dbf4d02-61da-4a02-a7d6-a8de4aafcd67</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="uuid">3dbf4d02-61da-4a02-a7d6-a8de4aafcd67</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.config"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:07:e1:95"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <target dev="tap47a5f4a8-ec"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/console.log" append="off"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:36:23 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:36:23 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:36:23 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:36:23 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.091 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Preparing to wait for external event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.091 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.092 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.092 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.094 2 DEBUG nova.virt.libvirt.vif [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1154513081',display_name='tempest-TestNetworkBasicOps-server-1154513081',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1154513081',id=19,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBc2Q6aGE0NnI427OvTHfwEOWmMmCWerSDayUXMFRIWN8JjOMhSO/+Q4kwOq6rha2Svgy4r4GY+jvloBkShFseEa5NnoN77AhyA5J6ElXsKlt1n1a+RdvZdfuk3rDx0ZvA==',key_name='tempest-TestNetworkBasicOps-1012604518',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dqwaax58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:36:19Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=3dbf4d02-61da-4a02-a7d6-a8de4aafcd67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.094 2 DEBUG nova.network.os_vif_util [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.095 2 DEBUG nova.network.os_vif_util [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.096 2 DEBUG os_vif [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.097 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.104 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47a5f4a8-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.104 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47a5f4a8-ec, col_values=(('external_ids', {'iface-id': '47a5f4a8-ec88-4a5f-a9d4-255266429b71', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:07:e1:95', 'vm-uuid': '3dbf4d02-61da-4a02-a7d6-a8de4aafcd67'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 NetworkManager[51690]: <info>  [1759556183.1079] manager: (tap47a5f4a8-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.115 2 INFO os_vif [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec')
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.259 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.259 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.260 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:07:e1:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.260 2 INFO nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Using config drive
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.675 2 INFO nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Creating config drive at /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.config
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.679 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9p5xhkm5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.809 2 DEBUG oslo_concurrency.processutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9p5xhkm5" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:36:23 np0005470441 kernel: tap47a5f4a8-ec: entered promiscuous mode
Oct  4 01:36:23 np0005470441 NetworkManager[51690]: <info>  [1759556183.8622] manager: (tap47a5f4a8-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Oct  4 01:36:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:23Z|00146|binding|INFO|Claiming lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 for this chassis.
Oct  4 01:36:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:23Z|00147|binding|INFO|47a5f4a8-ec88-4a5f-a9d4-255266429b71: Claiming fa:16:3e:07:e1:95 10.100.0.29
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.873 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:e1:95 10.100.0.29'], port_security=['fa:16:3e:07:e1:95 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '3dbf4d02-61da-4a02-a7d6-a8de4aafcd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11abf421-a0cf-4582-8538-480535fc1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': '619b7b37-fd77-49d8-b2d5-7d563c58baac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d275b4ef-408f-4950-9520-db5189359a16, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=47a5f4a8-ec88-4a5f-a9d4-255266429b71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.874 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 47a5f4a8-ec88-4a5f-a9d4-255266429b71 in datapath 11abf421-a0cf-4582-8538-480535fc1876 bound to our chassis
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.876 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 11abf421-a0cf-4582-8538-480535fc1876
Oct  4 01:36:23 np0005470441 systemd-udevd[223928]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.888 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1e8726b1-e5cc-4cdd-a02b-acc6d775d627]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.889 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap11abf421-a1 in ovnmeta-11abf421-a0cf-4582-8538-480535fc1876 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.891 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap11abf421-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.891 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[654e40d9-bba8-491d-b91d-84207ecb67c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.893 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[019a4a7b-39bc-4672-b3f0-be3faf8843ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 NetworkManager[51690]: <info>  [1759556183.8998] device (tap47a5f4a8-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:36:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:23Z|00148|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 ovn-installed in OVS
Oct  4 01:36:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:23Z|00149|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 up in Southbound
Oct  4 01:36:23 np0005470441 NetworkManager[51690]: <info>  [1759556183.9011] device (tap47a5f4a8-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:36:23 np0005470441 nova_compute[192626]: 2025-10-04 05:36:23.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.904 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[fb66d1ee-d8ab-4984-98d9-1547c52018ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 systemd-machined[152624]: New machine qemu-10-instance-00000013.
Oct  4 01:36:23 np0005470441 systemd[1]: Started Virtual Machine qemu-10-instance-00000013.
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.928 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f0d36c43-9987-4a16-a33f-3209565dc18f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.960 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[76998ddb-c8c5-40e6-a689-6239b1737eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 systemd-udevd[223932]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:36:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.967 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c17b2f-7ec4-45f3-be2e-ced8e4a3fee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:23 np0005470441 NetworkManager[51690]: <info>  [1759556183.9707] manager: (tap11abf421-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:23.999 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[374ecc71-7c8c-4266-b7c4-c9a14e70e709]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.002 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[39e30879-825b-40e4-a24f-841ac57cbe5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 NetworkManager[51690]: <info>  [1759556184.0245] device (tap11abf421-a0): carrier: link connected
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.027 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1302470f-f031-4374-974c-d3c3289dace7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.047 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9eced7e9-f659-40cf-a0c9-bcbe6d3dc080]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11abf421-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:bc:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408168, 'reachable_time': 37346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223961, 'error': None, 'target': 'ovnmeta-11abf421-a0cf-4582-8538-480535fc1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.063 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6e451d0a-7282-4aae-90da-a3479ee4fc87]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:bc1e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408168, 'tstamp': 408168}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223962, 'error': None, 'target': 'ovnmeta-11abf421-a0cf-4582-8538-480535fc1876', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.080 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4eb61d-a662-4754-8c2c-32a21eac3595]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11abf421-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:bc:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 39], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408168, 'reachable_time': 37346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223963, 'error': None, 'target': 'ovnmeta-11abf421-a0cf-4582-8538-480535fc1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.113 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d95ee323-be96-4819-a836-dff8ccae9d70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.169 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[83e5551a-6dbf-46a4-bfbb-8720756b412f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.170 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11abf421-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.170 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.171 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11abf421-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:36:24 np0005470441 NetworkManager[51690]: <info>  [1759556184.1737] manager: (tap11abf421-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Oct  4 01:36:24 np0005470441 kernel: tap11abf421-a0: entered promiscuous mode
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.176 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap11abf421-a0, col_values=(('external_ids', {'iface-id': '2cb96b2f-7e3c-4381-82be-c2d8acfd4a9f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:24 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:24Z|00150|binding|INFO|Releasing lport 2cb96b2f-7e3c-4381-82be-c2d8acfd4a9f from this chassis (sb_readonly=0)
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.190 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/11abf421-a0cf-4582-8538-480535fc1876.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/11abf421-a0cf-4582-8538-480535fc1876.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.192 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[647d7a1f-342b-414e-ac2d-b141de141495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.193 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-11abf421-a0cf-4582-8538-480535fc1876
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/11abf421-a0cf-4582-8538-480535fc1876.pid.haproxy
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 11abf421-a0cf-4582-8538-480535fc1876
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:36:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:24.194 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-11abf421-a0cf-4582-8538-480535fc1876', 'env', 'PROCESS_TAG=haproxy-11abf421-a0cf-4582-8538-480535fc1876', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/11abf421-a0cf-4582-8538-480535fc1876.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.226 2 DEBUG nova.compute.manager [req-da17a94f-b920-4593-8b79-717e091a185c req-2d941526-743f-48ba-bb06-ef75d61ed47b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.227 2 DEBUG oslo_concurrency.lockutils [req-da17a94f-b920-4593-8b79-717e091a185c req-2d941526-743f-48ba-bb06-ef75d61ed47b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.227 2 DEBUG oslo_concurrency.lockutils [req-da17a94f-b920-4593-8b79-717e091a185c req-2d941526-743f-48ba-bb06-ef75d61ed47b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.228 2 DEBUG oslo_concurrency.lockutils [req-da17a94f-b920-4593-8b79-717e091a185c req-2d941526-743f-48ba-bb06-ef75d61ed47b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.228 2 DEBUG nova.compute.manager [req-da17a94f-b920-4593-8b79-717e091a185c req-2d941526-743f-48ba-bb06-ef75d61ed47b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Processing event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.345 2 DEBUG nova.network.neutron [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Updated VIF entry in instance network info cache for port 47a5f4a8-ec88-4a5f-a9d4-255266429b71. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.345 2 DEBUG nova.network.neutron [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Updating instance_info_cache with network_info: [{"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.375 2 DEBUG oslo_concurrency.lockutils [req-39f45126-dc75-4bbd-a3fb-ef2faa8bfa2b req-0cd6b51b-d3ee-4637-8617-bf405e230ea5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:36:24 np0005470441 podman[224001]: 2025-10-04 05:36:24.551918729 +0000 UTC m=+0.044288590 container create 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  4 01:36:24 np0005470441 systemd[1]: Started libpod-conmon-09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678.scope.
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.590 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.592 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556184.5914078, 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.593 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] VM Started (Lifecycle Event)#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.596 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.600 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Instance spawned successfully.#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.601 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:36:24 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:36:24 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f25e989eb2c7537ae7c923814c4069ab52fe0625c8bde7186ad94f5006d56304/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:36:24 np0005470441 podman[224001]: 2025-10-04 05:36:24.526841076 +0000 UTC m=+0.019210957 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.628 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:36:24 np0005470441 podman[224001]: 2025-10-04 05:36:24.630245905 +0000 UTC m=+0.122615796 container init 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:36:24 np0005470441 podman[224001]: 2025-10-04 05:36:24.635710401 +0000 UTC m=+0.128080262 container start 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.636 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.640 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.641 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.642 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.642 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.643 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.644 2 DEBUG nova.virt.libvirt.driver [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:36:24 np0005470441 podman[224015]: 2025-10-04 05:36:24.653259119 +0000 UTC m=+0.058345949 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': 
{'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:36:24 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [NOTICE]   (224039) : New worker (224045) forked
Oct  4 01:36:24 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [NOTICE]   (224039) : Loading success.
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.670 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.670 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556184.591491, 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.671 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.721 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.724 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556184.597904, 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.724 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.737 2 INFO nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Took 5.38 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.738 2 DEBUG nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.749 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.752 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.787 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.821 2 INFO nova.compute.manager [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Took 5.92 seconds to build instance.#033[00m
Oct  4 01:36:24 np0005470441 nova_compute[192626]: 2025-10-04 05:36:24.844 2 DEBUG oslo_concurrency.lockutils [None req-27d692a9-09e1-4af5-a4d6-2f3671819941 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.339 2 DEBUG nova.compute.manager [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.340 2 DEBUG oslo_concurrency.lockutils [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.341 2 DEBUG oslo_concurrency.lockutils [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.342 2 DEBUG oslo_concurrency.lockutils [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.342 2 DEBUG nova.compute.manager [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] No waiting events found dispatching network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:36:26 np0005470441 nova_compute[192626]: 2025-10-04 05:36:26.343 2 WARNING nova.compute.manager [req-1380810b-9a10-4320-b192-2b1766ba4a70 req-042c5255-efb5-47bf-812e-4098494d7337 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received unexpected event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:36:27 np0005470441 nova_compute[192626]: 2025-10-04 05:36:27.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:28 np0005470441 nova_compute[192626]: 2025-10-04 05:36:28.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:28 np0005470441 podman[224054]: 2025-10-04 05:36:28.290135102 +0000 UTC m=+0.043272671 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:36:30 np0005470441 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  4 01:36:30 np0005470441 podman[224081]: 2025-10-04 05:36:30.584214318 +0000 UTC m=+0.063455554 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct  4 01:36:32 np0005470441 nova_compute[192626]: 2025-10-04 05:36:32.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:33 np0005470441 nova_compute[192626]: 2025-10-04 05:36:33.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:34 np0005470441 podman[224102]: 2025-10-04 05:36:34.358312012 +0000 UTC m=+0.109032070 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:36:37 np0005470441 nova_compute[192626]: 2025-10-04 05:36:37.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:38 np0005470441 nova_compute[192626]: 2025-10-04 05:36:38.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:38 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:38Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:07:e1:95 10.100.0.29
Oct  4 01:36:38 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:38Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:07:e1:95 10.100.0.29
Oct  4 01:36:42 np0005470441 nova_compute[192626]: 2025-10-04 05:36:42.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:43 np0005470441 nova_compute[192626]: 2025-10-04 05:36:43.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:44 np0005470441 podman[224146]: 2025-10-04 05:36:44.298421785 +0000 UTC m=+0.050719543 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:36:44 np0005470441 podman[224145]: 2025-10-04 05:36:44.335030115 +0000 UTC m=+0.086295273 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:36:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:45Z|00151|binding|INFO|Releasing lport 2cb96b2f-7e3c-4381-82be-c2d8acfd4a9f from this chassis (sb_readonly=0)
Oct  4 01:36:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:45Z|00152|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:36:45 np0005470441 nova_compute[192626]: 2025-10-04 05:36:45.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.874 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.875 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.876 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.877 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.877 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.880 2 INFO nova.compute.manager [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Terminating instance#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.881 2 DEBUG nova.compute.manager [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:36:47 np0005470441 kernel: tap47a5f4a8-ec (unregistering): left promiscuous mode
Oct  4 01:36:47 np0005470441 NetworkManager[51690]: <info>  [1759556207.9056] device (tap47a5f4a8-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:47Z|00153|binding|INFO|Releasing lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 from this chassis (sb_readonly=0)
Oct  4 01:36:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:47Z|00154|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 down in Southbound
Oct  4 01:36:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:47Z|00155|binding|INFO|Removing iface tap47a5f4a8-ec ovn-installed in OVS
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:47 np0005470441 nova_compute[192626]: 2025-10-04 05:36:47.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:47.961 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:e1:95 10.100.0.29'], port_security=['fa:16:3e:07:e1:95 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '3dbf4d02-61da-4a02-a7d6-a8de4aafcd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11abf421-a0cf-4582-8538-480535fc1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '619b7b37-fd77-49d8-b2d5-7d563c58baac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d275b4ef-408f-4950-9520-db5189359a16, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=47a5f4a8-ec88-4a5f-a9d4-255266429b71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:36:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:47.965 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 47a5f4a8-ec88-4a5f-a9d4-255266429b71 in datapath 11abf421-a0cf-4582-8538-480535fc1876 unbound from our chassis#033[00m
Oct  4 01:36:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:47.968 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11abf421-a0cf-4582-8538-480535fc1876, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:36:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:47.969 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[deba22ce-3cda-41b9-883b-2807ea1655ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:47.970 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-11abf421-a0cf-4582-8538-480535fc1876 namespace which is not needed anymore#033[00m
Oct  4 01:36:48 np0005470441 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Deactivated successfully.
Oct  4 01:36:48 np0005470441 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000013.scope: Consumed 13.286s CPU time.
Oct  4 01:36:48 np0005470441 systemd-machined[152624]: Machine qemu-10-instance-00000013 terminated.
Oct  4 01:36:48 np0005470441 kernel: tap47a5f4a8-ec: entered promiscuous mode
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 systemd-udevd[224193]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00156|binding|INFO|Claiming lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 for this chassis.
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00157|binding|INFO|47a5f4a8-ec88-4a5f-a9d4-255266429b71: Claiming fa:16:3e:07:e1:95 10.100.0.29
Oct  4 01:36:48 np0005470441 NetworkManager[51690]: <info>  [1759556208.1124] manager: (tap47a5f4a8-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 kernel: tap47a5f4a8-ec (unregistering): left promiscuous mode
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.118 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:e1:95 10.100.0.29'], port_security=['fa:16:3e:07:e1:95 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '3dbf4d02-61da-4a02-a7d6-a8de4aafcd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11abf421-a0cf-4582-8538-480535fc1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '619b7b37-fd77-49d8-b2d5-7d563c58baac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d275b4ef-408f-4950-9520-db5189359a16, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=47a5f4a8-ec88-4a5f-a9d4-255266429b71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:36:48 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [NOTICE]   (224039) : haproxy version is 2.8.14-c23fe91
Oct  4 01:36:48 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [NOTICE]   (224039) : path to executable is /usr/sbin/haproxy
Oct  4 01:36:48 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [WARNING]  (224039) : Exiting Master process...
Oct  4 01:36:48 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [ALERT]    (224039) : Current worker (224045) exited with code 143 (Terminated)
Oct  4 01:36:48 np0005470441 neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876[224018]: [WARNING]  (224039) : All workers exited. Exiting... (0)
Oct  4 01:36:48 np0005470441 systemd[1]: libpod-09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678.scope: Deactivated successfully.
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00158|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 ovn-installed in OVS
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00159|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 up in Southbound
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00160|binding|INFO|Releasing lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 from this chassis (sb_readonly=1)
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00161|if_status|INFO|Dropped 2 log messages in last 235 seconds (most recently, 235 seconds ago) due to excessive rate
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00162|if_status|INFO|Not setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 down as sb is readonly
Oct  4 01:36:48 np0005470441 podman[224212]: 2025-10-04 05:36:48.150014821 +0000 UTC m=+0.070428393 container died 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00163|binding|INFO|Removing iface tap47a5f4a8-ec ovn-installed in OVS
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00164|binding|INFO|Releasing lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 from this chassis (sb_readonly=0)
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:48Z|00165|binding|INFO|Setting lport 47a5f4a8-ec88-4a5f-a9d4-255266429b71 down in Southbound
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.157 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:07:e1:95 10.100.0.29'], port_security=['fa:16:3e:07:e1:95 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': '3dbf4d02-61da-4a02-a7d6-a8de4aafcd67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11abf421-a0cf-4582-8538-480535fc1876', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '619b7b37-fd77-49d8-b2d5-7d563c58baac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d275b4ef-408f-4950-9520-db5189359a16, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=47a5f4a8-ec88-4a5f-a9d4-255266429b71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.176 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Instance destroyed successfully.#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.177 2 DEBUG nova.objects.instance [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:36:48 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678-userdata-shm.mount: Deactivated successfully.
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.192 2 DEBUG nova.virt.libvirt.vif [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:36:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1154513081',display_name='tempest-TestNetworkBasicOps-server-1154513081',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1154513081',id=19,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBc2Q6aGE0NnI427OvTHfwEOWmMmCWerSDayUXMFRIWN8JjOMhSO/+Q4kwOq6rha2Svgy4r4GY+jvloBkShFseEa5NnoN77AhyA5J6ElXsKlt1n1a+RdvZdfuk3rDx0ZvA==',key_name='tempest-TestNetworkBasicOps-1012604518',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:36:24Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dqwaax58',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:36:24Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=3dbf4d02-61da-4a02-a7d6-a8de4aafcd67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.193 2 DEBUG nova.network.os_vif_util [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "address": "fa:16:3e:07:e1:95", "network": {"id": "11abf421-a0cf-4582-8538-480535fc1876", "bridge": "br-int", "label": "tempest-network-smoke--1342527831", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47a5f4a8-ec", "ovs_interfaceid": "47a5f4a8-ec88-4a5f-a9d4-255266429b71", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.194 2 DEBUG nova.network.os_vif_util [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.195 2 DEBUG os_vif [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:36:48 np0005470441 systemd[1]: var-lib-containers-storage-overlay-f25e989eb2c7537ae7c923814c4069ab52fe0625c8bde7186ad94f5006d56304-merged.mount: Deactivated successfully.
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.197 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47a5f4a8-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 podman[224212]: 2025-10-04 05:36:48.200335011 +0000 UTC m=+0.120748553 container cleanup 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3)
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.204 2 INFO os_vif [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:07:e1:95,bridge_name='br-int',has_traffic_filtering=True,id=47a5f4a8-ec88-4a5f-a9d4-255266429b71,network=Network(11abf421-a0cf-4582-8538-480535fc1876),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47a5f4a8-ec')#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.204 2 INFO nova.virt.libvirt.driver [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Deleting instance files /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67_del#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.205 2 INFO nova.virt.libvirt.driver [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Deletion of /var/lib/nova/instances/3dbf4d02-61da-4a02-a7d6-a8de4aafcd67_del complete#033[00m
Oct  4 01:36:48 np0005470441 systemd[1]: libpod-conmon-09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678.scope: Deactivated successfully.
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.259 2 INFO nova.compute.manager [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.260 2 DEBUG oslo.service.loopingcall [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.261 2 DEBUG nova.compute.manager [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.261 2 DEBUG nova.network.neutron [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:36:48 np0005470441 podman[224253]: 2025-10-04 05:36:48.269199059 +0000 UTC m=+0.043272671 container remove 09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.273 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbe2503-ead3-4973-9245-72a03330ef59]: (4, ('Sat Oct  4 05:36:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876 (09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678)\n09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678\nSat Oct  4 05:36:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-11abf421-a0cf-4582-8538-480535fc1876 (09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678)\n09516ecb3838b5ab13ef587cc43c87f4da283fb138129efaf21b0efb51796678\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.275 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[472140ef-75bc-49f2-9adb-d73f4d5d9cdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.276 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11abf421-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 kernel: tap11abf421-a0: left promiscuous mode
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.292 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[26a34aaf-6f9e-46f3-bf5f-29fd3220cd51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.313 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7d7dfb59-800c-44fb-91fc-d323f893708e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.314 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a57755-6158-4910-afba-326c38b832ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.332 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ef709b9c-6331-4469-b530-907ac92ef23b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408160, 'reachable_time': 37174, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224268, 'error': None, 'target': 'ovnmeta-11abf421-a0cf-4582-8538-480535fc1876', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.334 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-11abf421-a0cf-4582-8538-480535fc1876 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.334 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[d86b3c62-9c00-4973-9577-0f584c9f9598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.335 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 47a5f4a8-ec88-4a5f-a9d4-255266429b71 in datapath 11abf421-a0cf-4582-8538-480535fc1876 unbound from our chassis#033[00m
Oct  4 01:36:48 np0005470441 systemd[1]: run-netns-ovnmeta\x2d11abf421\x2da0cf\x2d4582\x2d8538\x2d480535fc1876.mount: Deactivated successfully.
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.336 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11abf421-a0cf-4582-8538-480535fc1876, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.336 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[840b3da7-e039-421e-9f87-0c46ddae80cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.337 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 47a5f4a8-ec88-4a5f-a9d4-255266429b71 in datapath 11abf421-a0cf-4582-8538-480535fc1876 unbound from our chassis#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.338 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11abf421-a0cf-4582-8538-480535fc1876, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:36:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:48.338 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5707b65e-f295-4b9f-97cd-99b6c32095b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:48 np0005470441 podman[224269]: 2025-10-04 05:36:48.437800771 +0000 UTC m=+0.065807211 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:36:48 np0005470441 podman[224270]: 2025-10-04 05:36:48.464612223 +0000 UTC m=+0.095815424 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.745 2 DEBUG nova.compute.manager [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-unplugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.746 2 DEBUG oslo_concurrency.lockutils [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.746 2 DEBUG oslo_concurrency.lockutils [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.746 2 DEBUG oslo_concurrency.lockutils [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.746 2 DEBUG nova.compute.manager [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] No waiting events found dispatching network-vif-unplugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:36:48 np0005470441 nova_compute[192626]: 2025-10-04 05:36:48.746 2 DEBUG nova.compute.manager [req-70537f6a-4637-4774-8226-b89e308d691c req-066c63e7-7ff3-4fc9-8cb5-5bae0e04b93b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-unplugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.814 2 DEBUG nova.network.neutron [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.849 2 INFO nova.compute.manager [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Took 1.59 seconds to deallocate network for instance.#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.940 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.940 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.973 2 DEBUG nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.991 2 DEBUG nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:36:49 np0005470441 nova_compute[192626]: 2025-10-04 05:36:49.992 2 DEBUG nova.compute.provider_tree [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.012 2 DEBUG nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.049 2 DEBUG nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.111 2 DEBUG nova.compute.provider_tree [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.130 2 DEBUG nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.212 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.284 2 INFO nova.scheduler.client.report [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.445 2 DEBUG oslo_concurrency.lockutils [None req-9be75746-ff7f-41e3-8d87-ef277ca56f98 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:50Z|00166|binding|INFO|Releasing lport 2f74c1e8-e0e9-433c-ab2c-0f15024fcd1a from this chassis (sb_readonly=0)
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.948 2 DEBUG nova.compute.manager [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.949 2 DEBUG oslo_concurrency.lockutils [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.949 2 DEBUG oslo_concurrency.lockutils [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.949 2 DEBUG oslo_concurrency.lockutils [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "3dbf4d02-61da-4a02-a7d6-a8de4aafcd67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.950 2 DEBUG nova.compute.manager [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] No waiting events found dispatching network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.950 2 WARNING nova.compute.manager [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received unexpected event network-vif-plugged-47a5f4a8-ec88-4a5f-a9d4-255266429b71 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:36:50 np0005470441 nova_compute[192626]: 2025-10-04 05:36:50.950 2 DEBUG nova.compute.manager [req-cf93bd43-5e71-43d3-92e0-5fc96185e146 req-0bd09a06-8725-4d94-9e3c-65ec927b5b23 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Received event network-vif-deleted-47a5f4a8-ec88-4a5f-a9d4-255266429b71 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:52 np0005470441 nova_compute[192626]: 2025-10-04 05:36:52.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:53 np0005470441 nova_compute[192626]: 2025-10-04 05:36:53.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 podman[224308]: 2025-10-04 05:36:55.303264013 +0000 UTC m=+0.061351974 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container)
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.301 2 DEBUG nova.compute.manager [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.302 2 DEBUG nova.compute.manager [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing instance network info cache due to event network-changed-74297ebf-db77-4cdc-a627-f0123223bbd8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.302 2 DEBUG oslo_concurrency.lockutils [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.302 2 DEBUG oslo_concurrency.lockutils [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.302 2 DEBUG nova.network.neutron [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Refreshing network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.418 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.418 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.419 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.419 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.419 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.420 2 INFO nova.compute.manager [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Terminating instance#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.421 2 DEBUG nova.compute.manager [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:36:55 np0005470441 kernel: tap74297ebf-db (unregistering): left promiscuous mode
Oct  4 01:36:55 np0005470441 NetworkManager[51690]: <info>  [1759556215.4435] device (tap74297ebf-db): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:55Z|00167|binding|INFO|Releasing lport 74297ebf-db77-4cdc-a627-f0123223bbd8 from this chassis (sb_readonly=0)
Oct  4 01:36:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:55Z|00168|binding|INFO|Setting lport 74297ebf-db77-4cdc-a627-f0123223bbd8 down in Southbound
Oct  4 01:36:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:36:55Z|00169|binding|INFO|Removing iface tap74297ebf-db ovn-installed in OVS
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.462 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:19:8b:92 10.100.0.11'], port_security=['fa:16:3e:19:8b:92 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '7b1055c2-f0e7-4493-a4cb-2fafa1519d27', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a901e824-af59-4d0d-a85b-944b8499efe5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cc616d4e-df4c-4d66-970c-f4a8e0cb5479', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f2f4ce93-db77-4538-9009-9e50923ab602, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=74297ebf-db77-4cdc-a627-f0123223bbd8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.463 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 74297ebf-db77-4cdc-a627-f0123223bbd8 in datapath a901e824-af59-4d0d-a85b-944b8499efe5 unbound from our chassis#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.464 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a901e824-af59-4d0d-a85b-944b8499efe5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.465 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3030fcfc-2678-4bd5-8384-9da407acb011]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.466 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5 namespace which is not needed anymore#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Deactivated successfully.
Oct  4 01:36:55 np0005470441 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000011.scope: Consumed 16.120s CPU time.
Oct  4 01:36:55 np0005470441 systemd-machined[152624]: Machine qemu-9-instance-00000011 terminated.
Oct  4 01:36:55 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [NOTICE]   (223602) : haproxy version is 2.8.14-c23fe91
Oct  4 01:36:55 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [NOTICE]   (223602) : path to executable is /usr/sbin/haproxy
Oct  4 01:36:55 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [WARNING]  (223602) : Exiting Master process...
Oct  4 01:36:55 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [ALERT]    (223602) : Current worker (223604) exited with code 143 (Terminated)
Oct  4 01:36:55 np0005470441 neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5[223598]: [WARNING]  (223602) : All workers exited. Exiting... (0)
Oct  4 01:36:55 np0005470441 systemd[1]: libpod-1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03.scope: Deactivated successfully.
Oct  4 01:36:55 np0005470441 podman[224352]: 2025-10-04 05:36:55.607271495 +0000 UTC m=+0.045701640 container died 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:36:55 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03-userdata-shm.mount: Deactivated successfully.
Oct  4 01:36:55 np0005470441 systemd[1]: var-lib-containers-storage-overlay-cae679344b6103ac8a8172dd865a9ee0315793c7008493ce27927397e8499e5f-merged.mount: Deactivated successfully.
Oct  4 01:36:55 np0005470441 podman[224352]: 2025-10-04 05:36:55.6623402 +0000 UTC m=+0.100770345 container cleanup 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 systemd[1]: libpod-conmon-1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03.scope: Deactivated successfully.
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.695 2 INFO nova.virt.libvirt.driver [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Instance destroyed successfully.#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.695 2 DEBUG nova.objects.instance [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:55 np0005470441 podman[224394]: 2025-10-04 05:36:55.72849999 +0000 UTC m=+0.043365033 container remove 1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.733 2 DEBUG nova.virt.libvirt.vif [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:35:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-232515109',display_name='tempest-TestNetworkBasicOps-server-232515109',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-232515109',id=17,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKXl/0n0maRedyncEWSafSqD2WwGB8Vqr91nng+d3xRMq22adOHq/udLYS3DSNPjzManhSKOloWbM/2YRVxEItwlVirx26joddP2R+wp4239FNQU8Fm+4331tnJcqVsp0A==',key_name='tempest-TestNetworkBasicOps-1787337635',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:35:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-otz6u9lu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:35:38Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=7b1055c2-f0e7-4493-a4cb-2fafa1519d27,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.733 2 DEBUG nova.network.os_vif_util [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.734 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[05b41be0-cbaa-4368-8c0c-b9d529d69f4f]: (4, ('Sat Oct  4 05:36:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5 (1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03)\n1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03\nSat Oct  4 05:36:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5 (1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03)\n1afbf35a3ff463e9c78e8f1695375a213ad76f67009316070f77528ae1519b03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.734 2 DEBUG nova.network.os_vif_util [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.734 2 DEBUG os_vif [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.735 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[08a9a8ff-42dd-4661-bf33-5a4721011664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.736 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74297ebf-db, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.737 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa901e824-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 kernel: tapa901e824-a0: left promiscuous mode
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.753 2 INFO os_vif [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:19:8b:92,bridge_name='br-int',has_traffic_filtering=True,id=74297ebf-db77-4cdc-a627-f0123223bbd8,network=Network(a901e824-af59-4d0d-a85b-944b8499efe5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap74297ebf-db')#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.753 2 INFO nova.virt.libvirt.driver [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Deleting instance files /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27_del#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.754 2 INFO nova.virt.libvirt.driver [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Deletion of /var/lib/nova/instances/7b1055c2-f0e7-4493-a4cb-2fafa1519d27_del complete#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.753 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[868d7133-705f-44f6-9874-6836371cb3c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.781 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7c54770c-e576-48dc-9051-b048992e5a31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.783 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e7de3bd5-a777-4060-bb24-075a836a1e5b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.797 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[34954e11-c71f-4c25-aa5c-a63505508d68]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403520, 'reachable_time': 30962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224413, 'error': None, 'target': 'ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.798 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a901e824-af59-4d0d-a85b-944b8499efe5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:36:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:36:55.799 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[1519cb5f-f1b8-4863-a850-645f15659ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:36:55 np0005470441 systemd[1]: run-netns-ovnmeta\x2da901e824\x2daf59\x2d4d0d\x2da85b\x2d944b8499efe5.mount: Deactivated successfully.
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.842 2 INFO nova.compute.manager [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.843 2 DEBUG oslo.service.loopingcall [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.843 2 DEBUG nova.compute.manager [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:36:55 np0005470441 nova_compute[192626]: 2025-10-04 05:36:55.843 2 DEBUG nova.network.neutron [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:36:56 np0005470441 nova_compute[192626]: 2025-10-04 05:36:56.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.386 2 DEBUG nova.network.neutron [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.408 2 INFO nova.compute.manager [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Took 1.56 seconds to deallocate network for instance.#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.459 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-unplugged-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.460 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.460 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.461 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.461 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] No waiting events found dispatching network-vif-unplugged-74297ebf-db77-4cdc-a627-f0123223bbd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.462 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-unplugged-74297ebf-db77-4cdc-a627-f0123223bbd8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.462 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.463 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.463 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.463 2 DEBUG oslo_concurrency.lockutils [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.464 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] No waiting events found dispatching network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.464 2 WARNING nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received unexpected event network-vif-plugged-74297ebf-db77-4cdc-a627-f0123223bbd8 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.464 2 DEBUG nova.compute.manager [req-a8787e93-3543-4e36-aa72-bb7b4a3fa800 req-674abd4e-e244-4522-9fc7-9ec142f0f851 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Received event network-vif-deleted-74297ebf-db77-4cdc-a627-f0123223bbd8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.493 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.494 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.571 2 DEBUG nova.compute.provider_tree [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.601 2 DEBUG nova.scheduler.client.report [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.650 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.700 2 INFO nova.scheduler.client.report [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 7b1055c2-f0e7-4493-a4cb-2fafa1519d27#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.735 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:57 np0005470441 nova_compute[192626]: 2025-10-04 05:36:57.805 2 DEBUG oslo_concurrency.lockutils [None req-0fada1f7-397e-4bfb-bd34-39b5eb9b67fe b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "7b1055c2-f0e7-4493-a4cb-2fafa1519d27" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:58 np0005470441 nova_compute[192626]: 2025-10-04 05:36:58.286 2 DEBUG nova.network.neutron [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updated VIF entry in instance network info cache for port 74297ebf-db77-4cdc-a627-f0123223bbd8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:36:58 np0005470441 nova_compute[192626]: 2025-10-04 05:36:58.286 2 DEBUG nova.network.neutron [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Updating instance_info_cache with network_info: [{"id": "74297ebf-db77-4cdc-a627-f0123223bbd8", "address": "fa:16:3e:19:8b:92", "network": {"id": "a901e824-af59-4d0d-a85b-944b8499efe5", "bridge": "br-int", "label": "tempest-network-smoke--741498463", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap74297ebf-db", "ovs_interfaceid": "74297ebf-db77-4cdc-a627-f0123223bbd8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:36:58 np0005470441 nova_compute[192626]: 2025-10-04 05:36:58.316 2 DEBUG oslo_concurrency.lockutils [req-b9cfb517-4184-485a-bec6-f3c9d06f14c0 req-20d02fbf-81d6-4199-8b1a-36245161dea1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-7b1055c2-f0e7-4493-a4cb-2fafa1519d27" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:36:59 np0005470441 podman[224414]: 2025-10-04 05:36:59.310282827 +0000 UTC m=+0.063477475 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.747 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.747 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.747 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.925 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.927 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5775MB free_disk=73.46553039550781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.927 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.927 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.996 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:36:59 np0005470441 nova_compute[192626]: 2025-10-04 05:36:59.997 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:37:00 np0005470441 nova_compute[192626]: 2025-10-04 05:37:00.028 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:37:00 np0005470441 nova_compute[192626]: 2025-10-04 05:37:00.050 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:37:00 np0005470441 nova_compute[192626]: 2025-10-04 05:37:00.078 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:37:00 np0005470441 nova_compute[192626]: 2025-10-04 05:37:00.078 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:00 np0005470441 nova_compute[192626]: 2025-10-04 05:37:00.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.077 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.078 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:37:01 np0005470441 podman[224440]: 2025-10-04 05:37:01.313976249 +0000 UTC m=+0.057638189 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.735 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  4 01:37:01 np0005470441 nova_compute[192626]: 2025-10-04 05:37:01.736 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:37:02 np0005470441 nova_compute[192626]: 2025-10-04 05:37:02.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:03 np0005470441 nova_compute[192626]: 2025-10-04 05:37:03.176 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556208.173676, 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:37:03 np0005470441 nova_compute[192626]: 2025-10-04 05:37:03.177 2 INFO nova.compute.manager [-] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] VM Stopped (Lifecycle Event)
Oct  4 01:37:03 np0005470441 nova_compute[192626]: 2025-10-04 05:37:03.196 2 DEBUG nova.compute.manager [None req-18785454-3b51-4056-937e-e63a23043d02 - - - - - -] [instance: 3dbf4d02-61da-4a02-a7d6-a8de4aafcd67] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:37:03 np0005470441 nova_compute[192626]: 2025-10-04 05:37:03.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:37:03 np0005470441 nova_compute[192626]: 2025-10-04 05:37:03.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:04 np0005470441 nova_compute[192626]: 2025-10-04 05:37:04.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:05 np0005470441 nova_compute[192626]: 2025-10-04 05:37:05.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:05 np0005470441 podman[224460]: 2025-10-04 05:37:05.340864556 +0000 UTC m=+0.095427404 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:37:05 np0005470441 nova_compute[192626]: 2025-10-04 05:37:05.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:06 np0005470441 nova_compute[192626]: 2025-10-04 05:37:06.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:37:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:06.743 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:06.744 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:06.744 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:37:07 np0005470441 nova_compute[192626]: 2025-10-04 05:37:07.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.425 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.426 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.474 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.658 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.659 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.667 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.668 2 INFO nova.compute.claims [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Claim successful on node compute-1.ctlplane.example.com
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.693 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556215.6929493, 7b1055c2-f0e7-4493-a4cb-2fafa1519d27 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.694 2 INFO nova.compute.manager [-] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] VM Stopped (Lifecycle Event)
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.744 2 DEBUG nova.compute.manager [None req-caee92ba-0d75-4077-8277-fbdbcce83dbe - - - - - -] [instance: 7b1055c2-f0e7-4493-a4cb-2fafa1519d27] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.854 2 DEBUG nova.compute.provider_tree [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.878 2 DEBUG nova.scheduler.client.report [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.954 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:37:10 np0005470441 nova_compute[192626]: 2025-10-04 05:37:10.955 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.010 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.011 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.031 2 INFO nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.051 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.151 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.154 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.154 2 INFO nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Creating image(s)
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.155 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.156 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.157 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.181 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.263 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.264 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.265 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.277 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.330 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.331 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.426 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk 1073741824" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.428 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.428 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.500 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.502 2 DEBUG nova.virt.disk.api [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Checking if we can resize image /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.502 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.565 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.567 2 DEBUG nova.virt.disk.api [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Cannot resize image /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.567 2 DEBUG nova.objects.instance [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 63d3b3b4-199b-4587-b7de-ed358aad629f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.601 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.602 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Ensure instance console log exists: /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.603 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.603 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.603 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:37:11 np0005470441 nova_compute[192626]: 2025-10-04 05:37:11.887 2 DEBUG nova.policy [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  4 01:37:12 np0005470441 nova_compute[192626]: 2025-10-04 05:37:12.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:37:13 np0005470441 nova_compute[192626]: 2025-10-04 05:37:13.436 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Successfully created port: 1727c984-f918-4a8a-880e-628b50e8dc5e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.828 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Successfully updated port: 1727c984-f918-4a8a-880e-628b50e8dc5e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.845 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.845 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquired lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.845 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.997 2 DEBUG nova.compute.manager [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.998 2 DEBUG nova.compute.manager [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing instance network info cache due to event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:37:14 np0005470441 nova_compute[192626]: 2025-10-04 05:37:14.998 2 DEBUG oslo_concurrency.lockutils [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:37:15 np0005470441 nova_compute[192626]: 2025-10-04 05:37:15.062 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  4 01:37:15 np0005470441 podman[224503]: 2025-10-04 05:37:15.308652947 +0000 UTC m=+0.057881496 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:37:15 np0005470441 podman[224502]: 2025-10-04 05:37:15.314283987 +0000 UTC m=+0.064598947 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:37:15 np0005470441 nova_compute[192626]: 2025-10-04 05:37:15.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.477 2 DEBUG nova.network.neutron [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.507 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Releasing lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.508 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Instance network_info: |[{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.508 2 DEBUG oslo_concurrency.lockutils [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.508 2 DEBUG nova.network.neutron [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.511 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Start _get_guest_xml network_info=[{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.515 2 WARNING nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.520 2 DEBUG nova.virt.libvirt.host [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.521 2 DEBUG nova.virt.libvirt.host [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.525 2 DEBUG nova.virt.libvirt.host [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.526 2 DEBUG nova.virt.libvirt.host [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.527 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.527 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.528 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.528 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.528 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.528 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.528 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.529 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.529 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.529 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.529 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.529 2 DEBUG nova.virt.hardware [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.533 2 DEBUG nova.virt.libvirt.vif [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:37:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=21,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-6ef3w186',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:37:11Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=63d3b3b4-199b-4587-b7de-ed358aad629f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.533 2 DEBUG nova.network.os_vif_util [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.533 2 DEBUG nova.network.os_vif_util [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.534 2 DEBUG nova.objects.instance [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 63d3b3b4-199b-4587-b7de-ed358aad629f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.546 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <uuid>63d3b3b4-199b-4587-b7de-ed358aad629f</uuid>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <name>instance-00000015</name>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727</nova:name>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:37:16</nova:creationTime>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:user uuid="560c2ee221db4d87b04080584e8f0a48">tempest-TestSecurityGroupsBasicOps-1075539829-project-member</nova:user>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:project uuid="2eaa5fc2c08b415c8c98103e044fc0a3">tempest-TestSecurityGroupsBasicOps-1075539829</nova:project>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        <nova:port uuid="1727c984-f918-4a8a-880e-628b50e8dc5e">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="serial">63d3b3b4-199b-4587-b7de-ed358aad629f</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="uuid">63d3b3b4-199b-4587-b7de-ed358aad629f</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.config"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:b2:e2:f4"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <target dev="tap1727c984-f9"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/console.log" append="off"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:37:16 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:37:16 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:37:16 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:37:16 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.548 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Preparing to wait for external event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.549 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.550 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.551 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.552 2 DEBUG nova.virt.libvirt.vif [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:37:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=21,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-6ef3w186',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:37:11Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=63d3b3b4-199b-4587-b7de-ed358aad629f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.552 2 DEBUG nova.network.os_vif_util [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.553 2 DEBUG nova.network.os_vif_util [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.553 2 DEBUG os_vif [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.554 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.555 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1727c984-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1727c984-f9, col_values=(('external_ids', {'iface-id': '1727c984-f918-4a8a-880e-628b50e8dc5e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:e2:f4', 'vm-uuid': '63d3b3b4-199b-4587-b7de-ed358aad629f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:16 np0005470441 NetworkManager[51690]: <info>  [1759556236.5612] manager: (tap1727c984-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.566 2 INFO os_vif [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9')#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.776 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.776 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.777 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No VIF found with MAC fa:16:3e:b2:e2:f4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.777 2 INFO nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Using config drive#033[00m
Oct  4 01:37:16 np0005470441 nova_compute[192626]: 2025-10-04 05:37:16.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:16.965 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:37:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:16.966 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.204 2 INFO nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Creating config drive at /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.config#033[00m
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.210 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptoc652y9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.341 2 DEBUG oslo_concurrency.processutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptoc652y9" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:17 np0005470441 kernel: tap1727c984-f9: entered promiscuous mode
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.4218] manager: (tap1727c984-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Oct  4 01:37:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:17Z|00170|binding|INFO|Claiming lport 1727c984-f918-4a8a-880e-628b50e8dc5e for this chassis.
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:17Z|00171|binding|INFO|1727c984-f918-4a8a-880e-628b50e8dc5e: Claiming fa:16:3e:b2:e2:f4 10.100.0.11
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.442 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:e2:f4 10.100.0.11'], port_security=['fa:16:3e:b2:e2:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7972d4d7-da89-4cf4-80f9-17e8ab47d731 e3bb3dd7-6212-4c62-8755-dee3eaf8206a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1727c984-f918-4a8a-880e-628b50e8dc5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.443 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1727c984-f918-4a8a-880e-628b50e8dc5e in datapath d0ec5187-5c01-49fd-b367-066aab190f52 bound to our chassis#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.444 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d0ec5187-5c01-49fd-b367-066aab190f52#033[00m
Oct  4 01:37:17 np0005470441 systemd-udevd[224562]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.455 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f48a2206-ecf2-4613-aeae-766d80025bb2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.456 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd0ec5187-51 in ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.458 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd0ec5187-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.458 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e61bd8ca-18df-45a0-a335-70a07435d4aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.459 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[590c0e16-9fa1-41f7-89e7-00cbd6225351]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.4718] device (tap1727c984-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.4737] device (tap1727c984-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.472 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[4f92c073-afd3-4492-82e1-c5e9f23055ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 systemd-machined[152624]: New machine qemu-11-instance-00000015.
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.511 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[947db0a7-880b-4cef-a40a-3bc8ffecabfb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 systemd[1]: Started Virtual Machine qemu-11-instance-00000015.
Oct  4 01:37:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:17Z|00172|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e ovn-installed in OVS
Oct  4 01:37:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:17Z|00173|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e up in Southbound
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.544 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c3bf22ce-1e3d-4044-b343-4379214eda39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.5501] manager: (tapd0ec5187-50): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.551 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[46c9e239-3964-4f15-9da9-ab4d21d1597f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.580 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[41621d3e-e1bb-4ab4-b404-3189058f441d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.583 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9001d28a-0082-43c8-8a10-ba435ed5c9ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.6069] device (tapd0ec5187-50): carrier: link connected
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.612 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[23e85f66-e892-496e-a649-0052d325ee90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.627 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[458d2b91-053f-47a9-b7df-9365b737d421]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0ec5187-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:76:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413526, 'reachable_time': 39012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224598, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.642 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a356da28-cf72-4ae2-9399-9ecac53793d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:763b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413526, 'tstamp': 413526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224599, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.655 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9f4035e2-f1e1-44f3-865c-ebef0417bf20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0ec5187-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:76:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413526, 'reachable_time': 39012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224600, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.677 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[19c26c77-e5ed-43b3-98ef-50617f4b7399]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.720 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ea1df73d-22dc-47a8-9c5e-481d5284dc24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.722 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0ec5187-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.722 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.722 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0ec5187-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:17 np0005470441 kernel: tapd0ec5187-50: entered promiscuous mode
Oct  4 01:37:17 np0005470441 NetworkManager[51690]: <info>  [1759556237.7245] manager: (tapd0ec5187-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.726 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd0ec5187-50, col_values=(('external_ids', {'iface-id': 'd1ccffa3-50a5-4834-b2c0-455085feb4e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:17Z|00174|binding|INFO|Releasing lport d1ccffa3-50a5-4834-b2c0-455085feb4e8 from this chassis (sb_readonly=0)
Oct  4 01:37:17 np0005470441 nova_compute[192626]: 2025-10-04 05:37:17.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.739 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d0ec5187-5c01-49fd-b367-066aab190f52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d0ec5187-5c01-49fd-b367-066aab190f52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.740 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e70f7adf-f905-4ced-9186-e719079e976d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.740 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-d0ec5187-5c01-49fd-b367-066aab190f52
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/d0ec5187-5c01-49fd-b367-066aab190f52.pid.haproxy
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID d0ec5187-5c01-49fd-b367-066aab190f52
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:37:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:17.741 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'env', 'PROCESS_TAG=haproxy-d0ec5187-5c01-49fd-b367-066aab190f52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d0ec5187-5c01-49fd-b367-066aab190f52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:37:18 np0005470441 podman[224640]: 2025-10-04 05:37:18.072026112 +0000 UTC m=+0.050398704 container create 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  4 01:37:18 np0005470441 systemd[1]: Started libpod-conmon-7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac.scope.
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.127 2 DEBUG nova.compute.manager [req-de4434ff-ae9a-454a-8322-c1e4e1fa0d48 req-ff656fec-a51b-49a0-a7ce-55d8a1d9a2ce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.129 2 DEBUG oslo_concurrency.lockutils [req-de4434ff-ae9a-454a-8322-c1e4e1fa0d48 req-ff656fec-a51b-49a0-a7ce-55d8a1d9a2ce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.129 2 DEBUG oslo_concurrency.lockutils [req-de4434ff-ae9a-454a-8322-c1e4e1fa0d48 req-ff656fec-a51b-49a0-a7ce-55d8a1d9a2ce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.129 2 DEBUG oslo_concurrency.lockutils [req-de4434ff-ae9a-454a-8322-c1e4e1fa0d48 req-ff656fec-a51b-49a0-a7ce-55d8a1d9a2ce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.129 2 DEBUG nova.compute.manager [req-de4434ff-ae9a-454a-8322-c1e4e1fa0d48 req-ff656fec-a51b-49a0-a7ce-55d8a1d9a2ce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Processing event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:37:18 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:37:18 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e393fbab0af4df3b297b478223af3cb3e0b48db512a189d8bf531fa3bf8fd81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:37:18 np0005470441 podman[224640]: 2025-10-04 05:37:18.044187511 +0000 UTC m=+0.022560123 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:37:18 np0005470441 podman[224640]: 2025-10-04 05:37:18.150390929 +0000 UTC m=+0.128763551 container init 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:37:18 np0005470441 podman[224640]: 2025-10-04 05:37:18.155713071 +0000 UTC m=+0.134085663 container start 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:37:18 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [NOTICE]   (224659) : New worker (224661) forked
Oct  4 01:37:18 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [NOTICE]   (224659) : Loading success.
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.327 2 DEBUG nova.network.neutron [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updated VIF entry in instance network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.328 2 DEBUG nova.network.neutron [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.344 2 DEBUG oslo_concurrency.lockutils [req-18780512-bffd-49b2-a9fd-961ecd1a07d4 req-8616b81d-b89a-4e8c-a1a4-2f6eb0bc254a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.395 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.395 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556238.3946488, 63d3b3b4-199b-4587-b7de-ed358aad629f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.396 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] VM Started (Lifecycle Event)#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.398 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.400 2 INFO nova.virt.libvirt.driver [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Instance spawned successfully.#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.401 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.419 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.424 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.426 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.427 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.427 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.427 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.428 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.428 2 DEBUG nova.virt.libvirt.driver [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.456 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.457 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556238.3975763, 63d3b3b4-199b-4587-b7de-ed358aad629f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.457 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.493 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.495 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556238.398371, 63d3b3b4-199b-4587-b7de-ed358aad629f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.495 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.505 2 INFO nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Took 7.35 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.505 2 DEBUG nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.515 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.519 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.555 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.589 2 INFO nova.compute.manager [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Took 7.97 seconds to build instance.#033[00m
Oct  4 01:37:18 np0005470441 nova_compute[192626]: 2025-10-04 05:37:18.607 2 DEBUG oslo_concurrency.lockutils [None req-81213bc2-d13e-42cd-859c-63396250ea98 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:19 np0005470441 podman[224671]: 2025-10-04 05:37:19.306655075 +0000 UTC m=+0.055269822 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:37:19 np0005470441 podman[224670]: 2025-10-04 05:37:19.3163471 +0000 UTC m=+0.064969527 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.341 2 DEBUG nova.compute.manager [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.341 2 DEBUG oslo_concurrency.lockutils [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.342 2 DEBUG oslo_concurrency.lockutils [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.342 2 DEBUG oslo_concurrency.lockutils [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.343 2 DEBUG nova.compute.manager [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] No waiting events found dispatching network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:37:20 np0005470441 nova_compute[192626]: 2025-10-04 05:37:20.343 2 WARNING nova.compute.manager [req-d538cf58-618d-4329-94af-8955cc7116e3 req-73467f50-d29e-4588-a17b-db38ae3f03a5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received unexpected event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e for instance with vm_state active and task_state None.#033[00m
Oct  4 01:37:21 np0005470441 nova_compute[192626]: 2025-10-04 05:37:21.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:21 np0005470441 NetworkManager[51690]: <info>  [1759556241.9116] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  4 01:37:21 np0005470441 NetworkManager[51690]: <info>  [1759556241.9122] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct  4 01:37:21 np0005470441 nova_compute[192626]: 2025-10-04 05:37:21.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:22 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:22Z|00175|binding|INFO|Releasing lport d1ccffa3-50a5-4834-b2c0-455085feb4e8 from this chassis (sb_readonly=0)
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.941 2 DEBUG nova.compute.manager [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.941 2 DEBUG nova.compute.manager [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing instance network info cache due to event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.942 2 DEBUG oslo_concurrency.lockutils [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.942 2 DEBUG oslo_concurrency.lockutils [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:22 np0005470441 nova_compute[192626]: 2025-10-04 05:37:22.942 2 DEBUG nova.network.neutron [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:37:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:22.968 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:26 np0005470441 nova_compute[192626]: 2025-10-04 05:37:26.037 2 DEBUG nova.network.neutron [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updated VIF entry in instance network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:37:26 np0005470441 nova_compute[192626]: 2025-10-04 05:37:26.038 2 DEBUG nova.network.neutron [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:26 np0005470441 nova_compute[192626]: 2025-10-04 05:37:26.080 2 DEBUG oslo_concurrency.lockutils [req-ab0790c6-891a-406b-8239-9dc216e9228d req-cc6e2575-6c9f-418d-b811-7ee9943272e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:26 np0005470441 podman[224712]: 2025-10-04 05:37:26.33236743 +0000 UTC m=+0.077533465 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:37:26 np0005470441 nova_compute[192626]: 2025-10-04 05:37:26.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:27 np0005470441 nova_compute[192626]: 2025-10-04 05:37:27.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:30Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b2:e2:f4 10.100.0.11
Oct  4 01:37:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:30Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b2:e2:f4 10.100.0.11
Oct  4 01:37:30 np0005470441 podman[224749]: 2025-10-04 05:37:30.300502337 +0000 UTC m=+0.053234664 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:37:31 np0005470441 nova_compute[192626]: 2025-10-04 05:37:31.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:32 np0005470441 podman[224774]: 2025-10-04 05:37:32.30524628 +0000 UTC m=+0.063592699 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Oct  4 01:37:32 np0005470441 nova_compute[192626]: 2025-10-04 05:37:32.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:36 np0005470441 podman[224793]: 2025-10-04 05:37:36.399375789 +0000 UTC m=+0.147524554 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:37:36 np0005470441 nova_compute[192626]: 2025-10-04 05:37:36.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:37 np0005470441 nova_compute[192626]: 2025-10-04 05:37:37.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:41 np0005470441 nova_compute[192626]: 2025-10-04 05:37:41.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:42 np0005470441 nova_compute[192626]: 2025-10-04 05:37:42.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.634 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.634 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.650 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.735 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.736 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.746 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.747 2 INFO nova.compute.claims [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.885 2 DEBUG nova.compute.provider_tree [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.898 2 DEBUG nova.scheduler.client.report [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.922 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.923 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.964 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.964 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:37:45 np0005470441 nova_compute[192626]: 2025-10-04 05:37:45.992 2 INFO nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.012 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.123 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.125 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.126 2 INFO nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Creating image(s)#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.126 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.127 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.128 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.154 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.253 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.255 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.255 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.272 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:46 np0005470441 podman[224820]: 2025-10-04 05:37:46.31129678 +0000 UTC m=+0.060252794 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:37:46 np0005470441 podman[224821]: 2025-10-04 05:37:46.311800384 +0000 UTC m=+0.057868456 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.353 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.354 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.393 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.394 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.394 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.461 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.462 2 DEBUG nova.virt.disk.api [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Checking if we can resize image /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.463 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.519 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.520 2 DEBUG nova.virt.disk.api [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Cannot resize image /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.521 2 DEBUG nova.objects.instance [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'migration_context' on Instance uuid e1b1174b-3b77-4182-ae71-92ba2e7be833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.533 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.534 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Ensure instance console log exists: /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.534 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.534 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.535 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:46 np0005470441 nova_compute[192626]: 2025-10-04 05:37:46.644 2 DEBUG nova.policy [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:37:47 np0005470441 nova_compute[192626]: 2025-10-04 05:37:47.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:47 np0005470441 nova_compute[192626]: 2025-10-04 05:37:47.709 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Successfully created port: 226d2c95-4b74-49b7-881e-e404dba21326 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.857 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Successfully updated port: 226d2c95-4b74-49b7-881e-e404dba21326 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.884 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.884 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquired lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.885 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.953 2 DEBUG nova.compute.manager [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-changed-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.954 2 DEBUG nova.compute.manager [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing instance network info cache due to event network-changed-226d2c95-4b74-49b7-881e-e404dba21326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:37:48 np0005470441 nova_compute[192626]: 2025-10-04 05:37:48.955 2 DEBUG oslo_concurrency.lockutils [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:37:49 np0005470441 nova_compute[192626]: 2025-10-04 05:37:49.085 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.161 2 DEBUG nova.network.neutron [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updating instance_info_cache with network_info: [{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.187 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Releasing lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.188 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Instance network_info: |[{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.188 2 DEBUG oslo_concurrency.lockutils [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.189 2 DEBUG nova.network.neutron [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.193 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Start _get_guest_xml network_info=[{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.199 2 WARNING nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.204 2 DEBUG nova.virt.libvirt.host [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.205 2 DEBUG nova.virt.libvirt.host [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.208 2 DEBUG nova.virt.libvirt.host [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.209 2 DEBUG nova.virt.libvirt.host [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.210 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.211 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.211 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.212 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.212 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.212 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.213 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.213 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.213 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.214 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.214 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.215 2 DEBUG nova.virt.hardware [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.220 2 DEBUG nova.virt.libvirt.vif [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:37:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ge',id=23,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-3now9jj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:37:46Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=e1b1174b-3b77-4182-ae71-92ba2e7be833,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.221 2 DEBUG nova.network.os_vif_util [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.222 2 DEBUG nova.network.os_vif_util [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.223 2 DEBUG nova.objects.instance [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid e1b1174b-3b77-4182-ae71-92ba2e7be833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.251 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <uuid>e1b1174b-3b77-4182-ae71-92ba2e7be833</uuid>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <name>instance-00000017</name>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671</nova:name>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:37:50</nova:creationTime>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:user uuid="560c2ee221db4d87b04080584e8f0a48">tempest-TestSecurityGroupsBasicOps-1075539829-project-member</nova:user>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:project uuid="2eaa5fc2c08b415c8c98103e044fc0a3">tempest-TestSecurityGroupsBasicOps-1075539829</nova:project>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        <nova:port uuid="226d2c95-4b74-49b7-881e-e404dba21326">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="serial">e1b1174b-3b77-4182-ae71-92ba2e7be833</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="uuid">e1b1174b-3b77-4182-ae71-92ba2e7be833</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.config"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:89:93:55"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <target dev="tap226d2c95-4b"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/console.log" append="off"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:37:50 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:37:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:37:50 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:37:50 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.252 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Preparing to wait for external event network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.252 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.252 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.252 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.253 2 DEBUG nova.virt.libvirt.vif [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:37:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ge',id=23,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-3now9jj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:37:46Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=e1b1174b-3b77-4182-ae71-92ba2e7be833,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.253 2 DEBUG nova.network.os_vif_util [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.254 2 DEBUG nova.network.os_vif_util [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.254 2 DEBUG os_vif [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.255 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.256 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.259 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap226d2c95-4b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap226d2c95-4b, col_values=(('external_ids', {'iface-id': '226d2c95-4b74-49b7-881e-e404dba21326', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:93:55', 'vm-uuid': 'e1b1174b-3b77-4182-ae71-92ba2e7be833'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:50 np0005470441 NetworkManager[51690]: <info>  [1759556270.2628] manager: (tap226d2c95-4b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.270 2 INFO os_vif [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b')#033[00m
Oct  4 01:37:50 np0005470441 podman[224880]: 2025-10-04 05:37:50.313491097 +0000 UTC m=+0.067111259 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:37:50 np0005470441 podman[224881]: 2025-10-04 05:37:50.325972071 +0000 UTC m=+0.062773435 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.340 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.341 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.341 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No VIF found with MAC fa:16:3e:89:93:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.341 2 INFO nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Using config drive#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.763 2 INFO nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Creating config drive at /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.config#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.768 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphem_7voh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.891 2 DEBUG oslo_concurrency.processutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphem_7voh" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:50 np0005470441 kernel: tap226d2c95-4b: entered promiscuous mode
Oct  4 01:37:50 np0005470441 NetworkManager[51690]: <info>  [1759556270.9458] manager: (tap226d2c95-4b): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Oct  4 01:37:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:50Z|00176|binding|INFO|Claiming lport 226d2c95-4b74-49b7-881e-e404dba21326 for this chassis.
Oct  4 01:37:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:50Z|00177|binding|INFO|226d2c95-4b74-49b7-881e-e404dba21326: Claiming fa:16:3e:89:93:55 10.100.0.6
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:50Z|00178|binding|INFO|Setting lport 226d2c95-4b74-49b7-881e-e404dba21326 ovn-installed in OVS
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 nova_compute[192626]: 2025-10-04 05:37:50.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:50 np0005470441 systemd-udevd[224939]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:37:50 np0005470441 systemd-machined[152624]: New machine qemu-12-instance-00000017.
Oct  4 01:37:50 np0005470441 NetworkManager[51690]: <info>  [1759556270.9987] device (tap226d2c95-4b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:37:51 np0005470441 NetworkManager[51690]: <info>  [1759556271.0001] device (tap226d2c95-4b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:37:51 np0005470441 systemd[1]: Started Virtual Machine qemu-12-instance-00000017.
Oct  4 01:37:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:51Z|00179|binding|INFO|Setting lport 226d2c95-4b74-49b7-881e-e404dba21326 up in Southbound
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.058 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:93:55 10.100.0.6'], port_security=['fa:16:3e:89:93:55 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e3bb3dd7-6212-4c62-8755-dee3eaf8206a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=226d2c95-4b74-49b7-881e-e404dba21326) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.059 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 226d2c95-4b74-49b7-881e-e404dba21326 in datapath d0ec5187-5c01-49fd-b367-066aab190f52 bound to our chassis#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.060 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d0ec5187-5c01-49fd-b367-066aab190f52#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.075 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[14646147-79ac-4da0-a6ba-542d70282dd3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.107 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a435549b-5bd0-416e-924c-3e253a4fb9a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.110 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[b627a5c6-85f6-4a2f-b0aa-8e4d19d9e1d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.136 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[f1513766-d638-4d5e-ad6d-fd4e551b306d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.152 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5434e620-6929-4b94-a783-3d6d1ecd5bc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0ec5187-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:76:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413526, 'reachable_time': 39012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224954, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.168 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[048fd7ef-fa55-4603-888b-c43bc36ae640]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd0ec5187-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413535, 'tstamp': 413535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224955, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd0ec5187-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413537, 'tstamp': 413537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224955, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.169 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0ec5187-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.172 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0ec5187-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.172 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.173 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd0ec5187-50, col_values=(('external_ids', {'iface-id': 'd1ccffa3-50a5-4834-b2c0-455085feb4e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:37:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:37:51.173 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:37:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:37:51Z|00180|binding|INFO|Releasing lport d1ccffa3-50a5-4834-b2c0-455085feb4e8 from this chassis (sb_readonly=0)
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.284 2 DEBUG nova.compute.manager [req-14de1e03-3de4-4a18-a200-7e2d270860ac req-783cafa7-76c3-40f6-b68c-840af50419d0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.284 2 DEBUG oslo_concurrency.lockutils [req-14de1e03-3de4-4a18-a200-7e2d270860ac req-783cafa7-76c3-40f6-b68c-840af50419d0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.285 2 DEBUG oslo_concurrency.lockutils [req-14de1e03-3de4-4a18-a200-7e2d270860ac req-783cafa7-76c3-40f6-b68c-840af50419d0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.285 2 DEBUG oslo_concurrency.lockutils [req-14de1e03-3de4-4a18-a200-7e2d270860ac req-783cafa7-76c3-40f6-b68c-840af50419d0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.285 2 DEBUG nova.compute.manager [req-14de1e03-3de4-4a18-a200-7e2d270860ac req-783cafa7-76c3-40f6-b68c-840af50419d0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Processing event network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.733 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.734 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556271.7326703, e1b1174b-3b77-4182-ae71-92ba2e7be833 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.734 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] VM Started (Lifecycle Event)#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.741 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.745 2 INFO nova.virt.libvirt.driver [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Instance spawned successfully.#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.745 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.769 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.772 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.778 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.779 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.780 2 DEBUG nova.virt.libvirt.driver [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.813 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.813 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556271.732841, e1b1174b-3b77-4182-ae71-92ba2e7be833 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.814 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.837 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.841 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556271.7405875, e1b1174b-3b77-4182-ae71-92ba2e7be833 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.841 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.857 2 INFO nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Took 5.73 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.857 2 DEBUG nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.863 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.866 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.896 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.947 2 INFO nova.compute.manager [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Took 6.25 seconds to build instance.#033[00m
Oct  4 01:37:51 np0005470441 nova_compute[192626]: 2025-10-04 05:37:51.964 2 DEBUG oslo_concurrency.lockutils [None req-2133f2d1-8f06-42a3-9866-5fe097af3f92 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:52 np0005470441 nova_compute[192626]: 2025-10-04 05:37:52.005 2 DEBUG nova.network.neutron [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updated VIF entry in instance network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:37:52 np0005470441 nova_compute[192626]: 2025-10-04 05:37:52.006 2 DEBUG nova.network.neutron [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updating instance_info_cache with network_info: [{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:52 np0005470441 nova_compute[192626]: 2025-10-04 05:37:52.028 2 DEBUG oslo_concurrency.lockutils [req-e510385b-92fb-435a-9c5c-1caaceee30c9 req-93476bfa-3d10-4576-9a7a-3f1f17e732f2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:52 np0005470441 nova_compute[192626]: 2025-10-04 05:37:52.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.367 2 DEBUG nova.compute.manager [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.368 2 DEBUG oslo_concurrency.lockutils [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.368 2 DEBUG oslo_concurrency.lockutils [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.368 2 DEBUG oslo_concurrency.lockutils [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.368 2 DEBUG nova.compute.manager [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] No waiting events found dispatching network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:37:53 np0005470441 nova_compute[192626]: 2025-10-04 05:37:53.369 2 WARNING nova.compute.manager [req-3fe59e65-ec3c-4d4f-bcee-c47bc530958d req-07c8bbc5-090b-411c-ae91-ecdc5091dc02 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received unexpected event network-vif-plugged-226d2c95-4b74-49b7-881e-e404dba21326 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.079 2 DEBUG nova.compute.manager [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-changed-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.080 2 DEBUG nova.compute.manager [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing instance network info cache due to event network-changed-226d2c95-4b74-49b7-881e-e404dba21326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.080 2 DEBUG oslo_concurrency.lockutils [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.080 2 DEBUG oslo_concurrency.lockutils [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.080 2 DEBUG nova.network.neutron [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:55 np0005470441 nova_compute[192626]: 2025-10-04 05:37:55.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:37:56 np0005470441 nova_compute[192626]: 2025-10-04 05:37:56.283 2 DEBUG nova.network.neutron [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updated VIF entry in instance network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:37:56 np0005470441 nova_compute[192626]: 2025-10-04 05:37:56.285 2 DEBUG nova.network.neutron [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updating instance_info_cache with network_info: [{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:56 np0005470441 nova_compute[192626]: 2025-10-04 05:37:56.316 2 DEBUG oslo_concurrency.lockutils [req-24379587-7285-4866-9360-96411aaff644 req-54105386-f919-4752-81c1-7893f67aca7f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:57 np0005470441 nova_compute[192626]: 2025-10-04 05:37:57.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:57 np0005470441 podman[224963]: 2025-10-04 05:37:57.318426011 +0000 UTC m=+0.066333096 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Oct  4 01:37:57 np0005470441 nova_compute[192626]: 2025-10-04 05:37:57.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:37:57 np0005470441 nova_compute[192626]: 2025-10-04 05:37:57.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:37:58 np0005470441 nova_compute[192626]: 2025-10-04 05:37:58.051 2 DEBUG nova.compute.manager [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-changed-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:37:58 np0005470441 nova_compute[192626]: 2025-10-04 05:37:58.051 2 DEBUG nova.compute.manager [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing instance network info cache due to event network-changed-226d2c95-4b74-49b7-881e-e404dba21326. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:37:58 np0005470441 nova_compute[192626]: 2025-10-04 05:37:58.051 2 DEBUG oslo_concurrency.lockutils [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:37:58 np0005470441 nova_compute[192626]: 2025-10-04 05:37:58.051 2 DEBUG oslo_concurrency.lockutils [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:37:58 np0005470441 nova_compute[192626]: 2025-10-04 05:37:58.052 2 DEBUG nova.network.neutron [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Refreshing network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.275 2 DEBUG nova.network.neutron [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updated VIF entry in instance network info cache for port 226d2c95-4b74-49b7-881e-e404dba21326. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.276 2 DEBUG nova.network.neutron [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updating instance_info_cache with network_info: [{"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.318 2 DEBUG oslo_concurrency.lockutils [req-293f2190-01af-4e2f-8632-95f6ee44ad68 req-51b98f3c-ee01-4668-837e-18027395f8ea 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-e1b1174b-3b77-4182-ae71-92ba2e7be833" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.746 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.829 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.890 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.891 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.955 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:37:59 np0005470441 nova_compute[192626]: 2025-10-04 05:37:59.960 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.019 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.021 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.078 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.244 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.245 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5470MB free_disk=73.43589782714844GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.246 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.246 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.319 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 63d3b3b4-199b-4587-b7de-ed358aad629f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.321 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance e1b1174b-3b77-4182-ae71-92ba2e7be833 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.322 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.322 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.376 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.395 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.422 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:38:00 np0005470441 nova_compute[192626]: 2025-10-04 05:38:00.422 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:01 np0005470441 podman[224998]: 2025-10-04 05:38:01.29537514 +0000 UTC m=+0.051370421 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:38:01 np0005470441 nova_compute[192626]: 2025-10-04 05:38:01.423 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:01 np0005470441 nova_compute[192626]: 2025-10-04 05:38:01.424 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:38:01 np0005470441 nova_compute[192626]: 2025-10-04 05:38:01.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:01 np0005470441 nova_compute[192626]: 2025-10-04 05:38:01.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:38:01 np0005470441 nova_compute[192626]: 2025-10-04 05:38:01.719 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:38:02 np0005470441 nova_compute[192626]: 2025-10-04 05:38:02.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.710 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000017', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'hostId': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.712 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000015', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'hostId': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.713 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.716 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for e1b1174b-3b77-4182-ae71-92ba2e7be833 / tap226d2c95-4b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.716 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.721 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 63d3b3b4-199b-4587-b7de-ed358aad629f / tap1727c984-f9 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.722 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.incoming.packets volume: 119 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f23eb3f-1056-453d-8f67-9619591984c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.713292', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b711098-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': 'd78697f499e56bb115501e1bcebdb948c94ec72c77844dddbe6ace204b8dc40e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 119, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.713292', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b71d320-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '1ee66a442692883b3cfac6a089ac7eb9cd88df45b53e08d999ed8aa97cdb6154'}]}, 'timestamp': '2025-10-04 05:38:02.722724', '_unique_id': '6fec47e181c84e3fa76aab6087633757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.726 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.743 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/cpu volume: 10380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.760 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/cpu volume: 11970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae128131-ce09-48fb-8206-f5edffbd9ed1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10380000000, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'timestamp': '2025-10-04T05:38:02.726486', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4b751fe4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.458126884, 'message_signature': '5474a556bf31a19a6a9925f70809b7b9224285d1403cdba786182058f086cd91'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11970000000, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 
'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'timestamp': '2025-10-04T05:38:02.726486', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '4b77a8e0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.474630113, 'message_signature': '5fb10fe5d97e0c5c427836413236661ae722aab72ead8b7060f04cbb39baa43f'}]}, 'timestamp': '2025-10-04 05:38:02.760844', '_unique_id': '97182475376d4febabc92a2c41f0b85a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.762 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.763 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.763 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.763 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3230d9f6-1b30-4475-b924-0b848828d7d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.763231', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b781672-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '95b604cf1ac6f7e50ce6d9a9ab15e8f51c3ed9b87292d774a24650c7d201ef31'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.763231', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b7824fa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': 'e2b2aa7411d213dcf13e948fe132126cc6984434d03f908af1260e30c39b0fad'}]}, 'timestamp': '2025-10-04 05:38:02.763957', '_unique_id': 'cb20ff32ea8c428d91da0e7b9bcdea2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.764 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.765 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.765 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.766 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>]
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.766 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.778 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.778 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.790 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.791 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cadea6e-281b-45e6-bdbf-b831645dc856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.766447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b7a6aee-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': '033fecc4d561adb1a0a07b6e0d779b72a1dd5d3ba9869a4693e2a5f38a546adc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 
'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.766447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b7a7908-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': '0366377c9c15eb4e07cc24d78970a17c1f0a8f1e2ea8b95e4a46ea50beca9089'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.766447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b7c4cd8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': 'f21a39d8c1b3b479fa42e6d0d2c4fcefd4f03a5309f4340d646fd589ebbab1ea'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.766447', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b7c5b42-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': '28979fb600fd24998fee6788c9ce1bcd544bb42d62f6c8268a3ee9592d0a4842'}]}, 'timestamp': '2025-10-04 05:38:02.791666', '_unique_id': 'dbb35734f33541399a682d476743a6a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.792 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>]
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.794 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>]
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.795 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.795 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.795 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0d3b2f1-f9d6-4ad5-b4b2-bc7da100a951', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.795180', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b7cf660-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '66981ab3380cfdc7920de50f7a31c60b60af43c0f251afa9982da21ecb5c432d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.795180', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b7d0588-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': 'a2386e3ce80cc5e231f0343cba369ebad742601eb871029d72b695dd643b5ad6'}]}, 'timestamp': '2025-10-04 05:38:02.795944', '_unique_id': '69d93c3aa5094223ab5413924c39b1a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.796 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.797 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.797 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.798 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf56a878-084b-4e24-afe5-1783c051270d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.797958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b7d6262-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '9ff9b4361ad440166be278d77eed9b5d4db885572bb674c94e67e1c8f76cbd31'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.797958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b7d72d4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '337b59806d02af9a75df2654b87814dc6a65d36a03de7ee04821ad9e58fe39ec'}]}, 'timestamp': '2025-10-04 05:38:02.798721', '_unique_id': '7470c090117e46d0bb39f7b0650c4098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.799 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.800 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.800 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727>]
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.801 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.822 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.latency volume: 410914529 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.823 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.latency volume: 697310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.841 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.latency volume: 495179151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.841 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.latency volume: 134509210 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7d681de-f2c0-4bc4-b5d8-ec65f05acaa9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 410914529, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.801195', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b813298-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '53cc32e269ef75cb971173208b93f55d0bfaffb716d916cade252c486c9b1396'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 697310, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': 
'2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.801195', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b8140d0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '6790bf6b6a66a5d5e67a10a9f9bc123f5fd4c26c389eed00aac9189f2f9b3dd9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 495179151, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.801195', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b840130-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': 'd782c4115c1d626b9f082b977aab2937ca17d8f6d8f64e9e3d71d6a4f1f95465'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 134509210, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.801195', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b840c20-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': 'bde5c30f09e8d64eb368edcd4f544e4e28fc0e3fb78390e85cf4a8e751ca2a2c'}]}, 'timestamp': '2025-10-04 05:38:02.841919', 
'_unique_id': '16c92d1a0edb414f9b771109ab894fe2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.843 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.843 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/memory.usage volume: 40.421875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.843 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/memory.usage volume: 42.7734375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ed36ef6-48a0-4d98-85d8-058170d98298', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.421875, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'timestamp': '2025-10-04T05:38:02.843713', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4b845bb2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.458126884, 'message_signature': '324456ca03b0491263cd61e1084cf32ed3359a8e68b94e2cc1d855f76c5b925c'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7734375, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 
'63d3b3b4-199b-4587-b7de-ed358aad629f', 'timestamp': '2025-10-04T05:38:02.843713', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4b8463fa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.474630113, 'message_signature': 'ca8abf2bd7aeb61aae6a29b4fd72e5768453ae9fce890f07c39eeedc8705b427'}]}, 'timestamp': '2025-10-04 05:38:02.844154', '_unique_id': 'ba1756f9316e4594a8bd8fee75648364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.845 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.845 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.845 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bd19049-318b-4a0c-87e2-d310fcb2ffae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.845354', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b849c58-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': 'e32fb845403a9ac9d9d911cffe589ecbc25007a8007e0ca776c9f5a5747e57c0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 
'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.845354', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b84a630-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '2f1fb6ea0f4dd7ae696c4e6f63c8ac527a513faa9ffccec472e6a72b728767a5'}]}, 'timestamp': '2025-10-04 05:38:02.845856', '_unique_id': '693ab0be44d24d3e9a8e18be2854d0cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.846 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.847 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.847 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.847 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '533d0eab-1933-4250-811c-9d702b906171', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.846974', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b84da9c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': '06d84f9cec1c826fb16dd32744e2a4967653233fc9e03282cb6cf08ad2e8de5c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 
'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.846974', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b84e2d0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': 'a567a43cb964973ea3a3b350cb0b074d996b7a0090b6921701f601232930383f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.846974', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 
'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b84ebe0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': '63881518991a19039d919bd64942c4e1722fac27ca7441ce0b5d3fcb7056e2ed'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.846974', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b84f388-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': '8a2f0167f243d2d71913c0b1b020bcbe94308063af44e9c874e9ab261cdfdb9c'}]}, 'timestamp': '2025-10-04 05:38:02.847826', '_unique_id': '8eb2a35efd6c4bbcb991f9605dd0a9b9'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.849 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.849 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.849 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.850 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.requests volume: 1117 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.850 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1e42a72-5428-49bc-94be-a91e64d5772d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.849138', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b852f24-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '0e7edfa05b54f5661b2dbbb5959185f6129a3061275b92aab7deaff44c87c5bc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': 
'2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.849138', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b854400-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '570ce511b657f747dc29d2eb5f3ccb4ebd4bff4531973a9e85c0b6bfe399574c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1117, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.849138', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b855eae-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '0872d64e1441a4e47eeedb78bfdc9dcaa226f6f23bcff53d4df30f089948b82f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.849138', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b8577a4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '350d11e143b0337a8de76dd58238c692d24d2ae969c1390227f576281b7ea311'}]}, 'timestamp': '2025-10-04 
05:38:02.851469', '_unique_id': 'cdbb901a90f9483495aef6cadc6ec44b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.853 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.854 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.854 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.requests volume: 342 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.854 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24577fe7-a2ea-474e-ab5b-78f04dfda708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.853911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b85e982-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': 'adbf20f8fede9cf62f05157702cb9fd18174e900a4d068799679e83f5aca333e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': 
'2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.853911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b85f148-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '4f817856f634c27722bf8801e1ac1bc738805da44096f37bd45c1da25c796f9e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 342, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.853911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 
'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b85f990-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': 'da0484187e7a2b68c378299927d7afbedb876c2f0003f090d690b235d2dd82a9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.853911', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b860480-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '51df58a1fa8d390317b80b7a1af5b0aee08df4b53263d9f86b259dbb39b2a456'}]}, 'timestamp': '2025-10-04 
05:38:02.854850', '_unique_id': 'be02f97e5246452683cc8d010eb7f5b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.856 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.856 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.856 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.856 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.856 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3adfec3-823f-4a92-b0d8-22f3684f89f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.856157', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b864166-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': 'fad814850a9aa49ab899cbe020ed8c4fff54c022952bb6dee22e1735e0b80cee'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': 
None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.856157', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b864c6a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.481030725, 'message_signature': '8cf2e9fe926ed3b0ec4fa0b6d557ce337ba9ec8dd770cc862879ec863f792f63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.856157', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 
'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b86553e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': 'fe076cd228762f36843004c4b732d66cf995c5a7c9d118378d86abe8e4e17491'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.856157', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b865cc8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.493745347, 'message_signature': '1a306bfc3ef4d3d558a38ce616420b7a50248fe45ab74ba3e8215aa48fcdbcb0'}]}, 'timestamp': '2025-10-04 05:38:02.857074', '_unique_id': '131e552b7a194771928f1226e5b4aa66'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.857 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.858 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.858 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.858 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52178137-4552-402b-93ee-ec409f359c0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.858246', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b869332-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': 'faf971de4e683a4a1bfd086bccd5d0df77c56db1776eb5cd7b28118a35a91182'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 
'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.858246', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b869dc8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '6bc33d3efb509e48c7a613a2ed3602ba46f4a4eb04d456adb50cfd23d6bd4014'}]}, 'timestamp': '2025-10-04 05:38:02.858750', '_unique_id': '432a4a77163f42fb90d56898d67f6b22'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.859 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a261502-24d6-4f7a-9a1a-619da279edad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.859963', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b86d68a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '6756217a3bec5d45436f081a5688159fe0e8c83bb014efd8c7eb3884ebe10e75'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 
'counter_volume': 130, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.859963', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b86defa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '1e9fc410578f327e5fcbf6418b8542f2717731d34c49c7e6d61a82a8dbb4b3ab'}]}, 'timestamp': '2025-10-04 05:38:02.860415', '_unique_id': 'ff9b7473a2214394a4bc24a1f94fa12d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.861 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.861 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.861 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.outgoing.bytes volume: 18398 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6032fba2-7b23-4964-b5d8-423af7213353', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.861645', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b87178a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': 'ca04df6a5b3908cebe03e120568227dea8b9937bcb5761ae85727e23d6327370'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 18398, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.861645', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b871fc8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': 'edc5802c8a57bd561ea21d90636d5a5ce0d69c4225d239f005ca3c7fcffd560f'}]}, 'timestamp': '2025-10-04 05:38:02.862074', '_unique_id': '9aca511d242d4d8cb432e34c02f4de0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.862 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.863 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.863 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.863 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.incoming.bytes volume: 22259 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a108c4e2-8ed8-41e4-bf06-65ea8ff13d46', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.863290', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b87584e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '027c80e247beb30ebc08eb7cc912e01b9c4b823b9330765cb7b3e6d4163d1649'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 
'counter_volume': 22259, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.863290', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b8761f4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': '2eff4f5d9a98a9f7a09d5ac8d0471a3aef021977193ec24918fd9520681ddb8c'}]}, 'timestamp': '2025-10-04 05:38:02.863768', '_unique_id': '37deb69c1d24476bb500970752dfe28a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.864 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.865 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.865 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.bytes volume: 31037952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.865 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4a50e1e-31f4-4872-8444-52fa0fdd2ab9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.864886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b879624-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '1a461f455765d5d9e8dff0d762881c5d4ddd49f098c6e5223f41963c1edf657d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': 
'2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.864886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b879e76-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '4c67efba65c263af712d1878f0915f8c162a364fcb38eb6d1ba57044dad759c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31037952, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.864886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 
128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b87a7e0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '93577702bc5fd85b0c3925029644a338bff9dd4d6aadcedef6083fd5745ac99e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.864886', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b87b2da-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': 'f7ce4b7459c25c716ed22608e14c8eb1b140facbfe2c8481721e34368b0c59c6'}]}, 'timestamp': '2025-10-04 05:38:02.865834', 
'_unique_id': '23a0b3e2e77240dcbab900f358b1cbee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.866 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69b893a6-6f73-4ab4-b29e-9fb2b634e1e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000017-e1b1174b-3b77-4182-ae71-92ba2e7be833-tap226d2c95-4b', 'timestamp': '2025-10-04T05:38:02.866966', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'tap226d2c95-4b', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:89:93:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap226d2c95-4b'}, 'message_id': '4b87e76e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.427825113, 'message_signature': '328f15484fca4d16d3f0c60845b8e01420ba0c9df7efd97eaef4ff5e818301b0'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 
'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'instance-00000015-63d3b3b4-199b-4587-b7de-ed358aad629f-tap1727c984-f9', 'timestamp': '2025-10-04T05:38:02.866966', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'tap1727c984-f9', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b2:e2:f4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1727c984-f9'}, 'message_id': '4b87efac-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.43230991, 'message_signature': 'ba2fab1d853d00aa41212c2338bf09c1dbb45b22a716902fde2805439ff537d4'}]}, 'timestamp': '2025-10-04 05:38:02.867416', '_unique_id': '886458e5b5fe4ef0b2da6008e573e433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.867 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.868 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.868 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.868 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.868 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.bytes volume: 72994816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69c6080b-79df-4840-9a19-a5b814bcfcb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.868498', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b88247c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '1a15ae2feeea74e5597bd01a594dab9a315ccc785310085b2c4b31cc9008e411'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 
'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.868498', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b882e18-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '630d2262c049d42eb669e79a4226b3cd29c13f040e95bcf991dfa1417541192a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72994816, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.868498', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b8835de-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '6a8447a3b743e8d000515c77f6ef4641be6c68aef7cfe007854c4b2047ae31e5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.868498', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b883d5e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '81a1eb430ec30d62034355081789f0488ddcb410936c0a4b49e59d0b57e71fe4'}]}, 'timestamp': '2025-10-04 05:38:02.869374', '_unique_id': 
'bdb3ff0674ab4809acd01c5cc330c7b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.869 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.870 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.870 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.870 12 DEBUG ceilometer.compute.pollsters [-] e1b1174b-3b77-4182-ae71-92ba2e7be833/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.871 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.latency volume: 2052881685 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.871 12 DEBUG ceilometer.compute.pollsters [-] 63d3b3b4-199b-4587-b7de-ed358aad629f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e3f7649-c068-4f13-bab7-c89bb985d393', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-vda', 'timestamp': '2025-10-04T05:38:02.870570', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b8875a8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '3ced74ff023aeb15adf58960ededbb7ee6f7740cb5961f01871b3a4bd5c3a1ec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 
'project_name': None, 'resource_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833-sda', 'timestamp': '2025-10-04T05:38:02.870570', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671', 'name': 'instance-00000017', 'instance_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b887f1c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.515760652, 'message_signature': '91ce268ddb52aa3c92a5a4d3ff621f47aa9e2c8b1e292f8794d1efdc47f9d2ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2052881685, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-vda', 'timestamp': '2025-10-04T05:38:02.870570', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 
'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4b8886c4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '6f4b7278f5c84d04be070c0d7c6877bd1718f2dc77a9b1aa2d9fb8083afcf21d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_name': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_name': None, 'resource_id': '63d3b3b4-199b-4587-b7de-ed358aad629f-sda', 'timestamp': '2025-10-04T05:38:02.870570', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727', 'name': 'instance-00000015', 'instance_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'instance_type': 'm1.nano', 'host': 'ccb4c7537197b5f516caa902ef1018c6b457d09b84a674bf54f95ee6', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4b889114-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4180.538248611, 'message_signature': '39d89d561fc376ec8c58d073c29aa9cb2a964efd7511bfde27cd28d40602ba9d'}]}, 'timestamp': '2025-10-04 05:38:02.871591', '_unique_id': 
'97718368169c4919968a801f113d31d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:38:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:38:02.872 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:38:02 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:38:03 np0005470441 podman[225044]: 2025-10-04 05:38:03.302322494 +0000 UTC m=+0.051681610 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  4 01:38:03 np0005470441 nova_compute[192626]: 2025-10-04 05:38:03.607 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:38:03 np0005470441 nova_compute[192626]: 2025-10-04 05:38:03.607 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:38:03 np0005470441 nova_compute[192626]: 2025-10-04 05:38:03.607 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:38:03 np0005470441 nova_compute[192626]: 2025-10-04 05:38:03.607 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 63d3b3b4-199b-4587-b7de-ed358aad629f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:38:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:03Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:93:55 10.100.0.6
Oct  4 01:38:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:03Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:93:55 10.100.0.6
Oct  4 01:38:04 np0005470441 nova_compute[192626]: 2025-10-04 05:38:04.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:05 np0005470441 nova_compute[192626]: 2025-10-04 05:38:05.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:06.745 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:06.745 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:06.746 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:07 np0005470441 podman[225064]: 2025-10-04 05:38:07.335562343 +0000 UTC m=+0.083476314 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct  4 01:38:07 np0005470441 nova_compute[192626]: 2025-10-04 05:38:07.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.712 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.742 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.743 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.743 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:08 np0005470441 nova_compute[192626]: 2025-10-04 05:38:08.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.702 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.703 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.704 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.704 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.704 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.706 2 INFO nova.compute.manager [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Terminating instance#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.706 2 DEBUG nova.compute.manager [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:38:10 np0005470441 kernel: tap226d2c95-4b (unregistering): left promiscuous mode
Oct  4 01:38:10 np0005470441 NetworkManager[51690]: <info>  [1759556290.7389] device (tap226d2c95-4b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:38:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:10Z|00181|binding|INFO|Releasing lport 226d2c95-4b74-49b7-881e-e404dba21326 from this chassis (sb_readonly=0)
Oct  4 01:38:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:10Z|00182|binding|INFO|Setting lport 226d2c95-4b74-49b7-881e-e404dba21326 down in Southbound
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:10Z|00183|binding|INFO|Removing iface tap226d2c95-4b ovn-installed in OVS
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.762 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:93:55 10.100.0.6', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e1b1174b-3b77-4182-ae71-92ba2e7be833', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=226d2c95-4b74-49b7-881e-e404dba21326) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.764 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 226d2c95-4b74-49b7-881e-e404dba21326 in datapath d0ec5187-5c01-49fd-b367-066aab190f52 unbound from our chassis#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.766 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d0ec5187-5c01-49fd-b367-066aab190f52#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.782 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c95040ce-7111-4099-8ec8-a59c3f602821]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Deactivated successfully.
Oct  4 01:38:10 np0005470441 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000017.scope: Consumed 12.956s CPU time.
Oct  4 01:38:10 np0005470441 systemd-machined[152624]: Machine qemu-12-instance-00000017 terminated.
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.816 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[325d0801-757d-4b72-96f9-3d0a7123ba5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.818 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[e46d55eb-6ded-41ff-9b5f-04bf24652c4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.848 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3ee541-f6f9-4f09-9338-df3fc66a7ddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.866 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8c03e75b-97a2-4979-afe3-2fbd4d9f5586]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd0ec5187-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:76:3b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1670, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 21, 'tx_packets': 7, 'rx_bytes': 1670, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413526, 'reachable_time': 39012, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 15, 'inoctets': 1208, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 15, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1208, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 15, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225102, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.885 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bc19a23a-e002-4a73-8be0-d297a5247e19]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd0ec5187-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413535, 'tstamp': 413535}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225103, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd0ec5187-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 413537, 'tstamp': 413537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225103, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.887 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0ec5187-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.934 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd0ec5187-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.934 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.934 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd0ec5187-50, col_values=(('external_ids', {'iface-id': 'd1ccffa3-50a5-4834-b2c0-455085feb4e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:10.935 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.965 2 INFO nova.virt.libvirt.driver [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Instance destroyed successfully.#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.966 2 DEBUG nova.objects.instance [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'resources' on Instance uuid e1b1174b-3b77-4182-ae71-92ba2e7be833 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.982 2 DEBUG nova.virt.libvirt.vif [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:37:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-gen-1-618653671',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ge',id=23,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:37:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-3now9jj7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:37:51Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=e1b1174b-3b77-4182-ae71-92ba2e7be833,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.983 2 DEBUG nova.network.os_vif_util [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "226d2c95-4b74-49b7-881e-e404dba21326", "address": "fa:16:3e:89:93:55", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap226d2c95-4b", "ovs_interfaceid": "226d2c95-4b74-49b7-881e-e404dba21326", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.984 2 DEBUG nova.network.os_vif_util [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.984 2 DEBUG os_vif [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.986 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap226d2c95-4b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.991 2 INFO os_vif [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:93:55,bridge_name='br-int',has_traffic_filtering=True,id=226d2c95-4b74-49b7-881e-e404dba21326,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap226d2c95-4b')#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.992 2 INFO nova.virt.libvirt.driver [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Deleting instance files /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833_del#033[00m
Oct  4 01:38:10 np0005470441 nova_compute[192626]: 2025-10-04 05:38:10.992 2 INFO nova.virt.libvirt.driver [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Deletion of /var/lib/nova/instances/e1b1174b-3b77-4182-ae71-92ba2e7be833_del complete#033[00m
Oct  4 01:38:11 np0005470441 nova_compute[192626]: 2025-10-04 05:38:11.050 2 INFO nova.compute.manager [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:38:11 np0005470441 nova_compute[192626]: 2025-10-04 05:38:11.051 2 DEBUG oslo.service.loopingcall [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:38:11 np0005470441 nova_compute[192626]: 2025-10-04 05:38:11.051 2 DEBUG nova.compute.manager [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:38:11 np0005470441 nova_compute[192626]: 2025-10-04 05:38:11.052 2 DEBUG nova.network.neutron [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:38:12 np0005470441 nova_compute[192626]: 2025-10-04 05:38:12.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:12 np0005470441 nova_compute[192626]: 2025-10-04 05:38:12.968 2 DEBUG nova.network.neutron [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:12 np0005470441 nova_compute[192626]: 2025-10-04 05:38:12.992 2 INFO nova.compute.manager [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Took 1.94 seconds to deallocate network for instance.#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.033 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.033 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.117 2 DEBUG nova.compute.provider_tree [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.129 2 DEBUG nova.compute.manager [req-ee64b5d9-d243-47c0-b94f-473a10f7bae0 req-b06fbe50-78bb-46ee-8eda-9a2be6c52add 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Received event network-vif-deleted-226d2c95-4b74-49b7-881e-e404dba21326 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.135 2 DEBUG nova.scheduler.client.report [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.158 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.183 2 INFO nova.scheduler.client.report [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Deleted allocations for instance e1b1174b-3b77-4182-ae71-92ba2e7be833#033[00m
Oct  4 01:38:13 np0005470441 nova_compute[192626]: 2025-10-04 05:38:13.253 2 DEBUG oslo_concurrency.lockutils [None req-f1ed109e-113d-47a9-ad4f-e762f242be5c 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "e1b1174b-3b77-4182-ae71-92ba2e7be833" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:15 np0005470441 nova_compute[192626]: 2025-10-04 05:38:15.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.322 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.323 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.323 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.324 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.324 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.326 2 INFO nova.compute.manager [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Terminating instance#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.328 2 DEBUG nova.compute.manager [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:38:16 np0005470441 kernel: tap1727c984-f9 (unregistering): left promiscuous mode
Oct  4 01:38:16 np0005470441 NetworkManager[51690]: <info>  [1759556296.3599] device (tap1727c984-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00184|binding|INFO|Releasing lport 1727c984-f918-4a8a-880e-628b50e8dc5e from this chassis (sb_readonly=0)
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00185|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e down in Southbound
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00186|binding|INFO|Removing iface tap1727c984-f9 ovn-installed in OVS
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.384 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:e2:f4 10.100.0.11'], port_security=['fa:16:3e:b2:e2:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7972d4d7-da89-4cf4-80f9-17e8ab47d731 e3bb3dd7-6212-4c62-8755-dee3eaf8206a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1727c984-f918-4a8a-880e-628b50e8dc5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.385 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1727c984-f918-4a8a-880e-628b50e8dc5e in datapath d0ec5187-5c01-49fd-b367-066aab190f52 unbound from our chassis#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.388 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0ec5187-5c01-49fd-b367-066aab190f52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.389 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3e79c85a-eef0-4e1a-b66c-3ce4fd5e7839]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.396 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52 namespace which is not needed anymore#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct  4 01:38:16 np0005470441 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Consumed 15.212s CPU time.
Oct  4 01:38:16 np0005470441 systemd-machined[152624]: Machine qemu-11-instance-00000015 terminated.
Oct  4 01:38:16 np0005470441 podman[225125]: 2025-10-04 05:38:16.462128573 +0000 UTC m=+0.059861313 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:38:16 np0005470441 podman[225123]: 2025-10-04 05:38:16.464799619 +0000 UTC m=+0.065742610 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  4 01:38:16 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [NOTICE]   (224659) : haproxy version is 2.8.14-c23fe91
Oct  4 01:38:16 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [NOTICE]   (224659) : path to executable is /usr/sbin/haproxy
Oct  4 01:38:16 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [WARNING]  (224659) : Exiting Master process...
Oct  4 01:38:16 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [ALERT]    (224659) : Current worker (224661) exited with code 143 (Terminated)
Oct  4 01:38:16 np0005470441 neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52[224655]: [WARNING]  (224659) : All workers exited. Exiting... (0)
Oct  4 01:38:16 np0005470441 systemd[1]: libpod-7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac.scope: Deactivated successfully.
Oct  4 01:38:16 np0005470441 podman[225184]: 2025-10-04 05:38:16.53097973 +0000 UTC m=+0.045681700 container died 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:38:16 np0005470441 kernel: tap1727c984-f9: entered promiscuous mode
Oct  4 01:38:16 np0005470441 kernel: tap1727c984-f9 (unregistering): left promiscuous mode
Oct  4 01:38:16 np0005470441 NetworkManager[51690]: <info>  [1759556296.5470] manager: (tap1727c984-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Oct  4 01:38:16 np0005470441 systemd-udevd[225138]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00187|binding|INFO|Claiming lport 1727c984-f918-4a8a-880e-628b50e8dc5e for this chassis.
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00188|binding|INFO|1727c984-f918-4a8a-880e-628b50e8dc5e: Claiming fa:16:3e:b2:e2:f4 10.100.0.11
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.559 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:e2:f4 10.100.0.11'], port_security=['fa:16:3e:b2:e2:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7972d4d7-da89-4cf4-80f9-17e8ab47d731 e3bb3dd7-6212-4c62-8755-dee3eaf8206a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1727c984-f918-4a8a-880e-628b50e8dc5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00189|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e ovn-installed in OVS
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00190|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e up in Southbound
Oct  4 01:38:16 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac-userdata-shm.mount: Deactivated successfully.
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 systemd[1]: var-lib-containers-storage-overlay-9e393fbab0af4df3b297b478223af3cb3e0b48db512a189d8bf531fa3bf8fd81-merged.mount: Deactivated successfully.
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00191|binding|INFO|Releasing lport 1727c984-f918-4a8a-880e-628b50e8dc5e from this chassis (sb_readonly=0)
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00192|binding|INFO|Setting lport 1727c984-f918-4a8a-880e-628b50e8dc5e down in Southbound
Oct  4 01:38:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:16Z|00193|binding|INFO|Removing iface tap1727c984-f9 ovn-installed in OVS
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 podman[225184]: 2025-10-04 05:38:16.578990944 +0000 UTC m=+0.093692914 container cleanup 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.587 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:e2:f4 10.100.0.11'], port_security=['fa:16:3e:b2:e2:f4 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '63d3b3b4-199b-4587-b7de-ed358aad629f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d0ec5187-5c01-49fd-b367-066aab190f52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7972d4d7-da89-4cf4-80f9-17e8ab47d731 e3bb3dd7-6212-4c62-8755-dee3eaf8206a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b964dfd-53a7-4031-b952-1172eab348bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1727c984-f918-4a8a-880e-628b50e8dc5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:16 np0005470441 systemd[1]: libpod-conmon-7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac.scope: Deactivated successfully.
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.599 2 INFO nova.virt.libvirt.driver [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Instance destroyed successfully.#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.600 2 DEBUG nova.objects.instance [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'resources' on Instance uuid 63d3b3b4-199b-4587-b7de-ed358aad629f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.615 2 DEBUG nova.virt.libvirt.vif [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:37:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-1688505727',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=21,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEPPCvh+0/27NPK7QaxbSf6tdbrt2mkjg4O77jk1dvScbusi+fO6V+f49FYOZoIGUQZbEjipPwmFy8iFtu29oLYkaX7Gx2Y5gOhEiJLT+or3V0Du75PLkE3/5tfCC7NsmQ==',key_name='tempest-TestSecurityGroupsBasicOps-517847362',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:37:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-6ef3w186',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:37:18Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=63d3b3b4-199b-4587-b7de-ed358aad629f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.616 2 DEBUG nova.network.os_vif_util [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.616 2 DEBUG nova.network.os_vif_util [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.617 2 DEBUG os_vif [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.618 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1727c984-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:16 np0005470441 podman[225224]: 2025-10-04 05:38:16.652279877 +0000 UTC m=+0.041880721 container remove 7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0)
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.657 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee8138e-7afe-4265-ac75-e37ebdad8180]: (4, ('Sat Oct  4 05:38:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52 (7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac)\n7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac\nSat Oct  4 05:38:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52 (7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac)\n7326a7fa945fbac00a3075123c445b76850e0980bef7214e98b6e331207825ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.658 2 INFO os_vif [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b2:e2:f4,bridge_name='br-int',has_traffic_filtering=True,id=1727c984-f918-4a8a-880e-628b50e8dc5e,network=Network(d0ec5187-5c01-49fd-b367-066aab190f52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1727c984-f9')#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.658 2 INFO nova.virt.libvirt.driver [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Deleting instance files /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f_del#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.658 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[196310f6-0fb4-4cc5-b147-e661a3e9debb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.659 2 INFO nova.virt.libvirt.driver [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Deletion of /var/lib/nova/instances/63d3b3b4-199b-4587-b7de-ed358aad629f_del complete#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.659 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd0ec5187-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:16 np0005470441 kernel: tapd0ec5187-50: left promiscuous mode
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.675 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5c42bcc0-cef6-4e27-9479-f6d6ae131098]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.693 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c58fe0d6-1eb5-47b5-b761-7a0164c0a54f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.694 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[85c98e80-f6cd-4b9d-94eb-c93df2a9df2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.706 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[506abb35-f79e-477b-8f5f-88f7950356ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 413519, 'reachable_time': 18983, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225239, 'error': None, 'target': 'ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.708 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d0ec5187-5c01-49fd-b367-066aab190f52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.708 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[0d26d421-b42e-4333-bf1c-69355ae3b955]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.709 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1727c984-f918-4a8a-880e-628b50e8dc5e in datapath d0ec5187-5c01-49fd-b367-066aab190f52 unbound from our chassis#033[00m
Oct  4 01:38:16 np0005470441 systemd[1]: run-netns-ovnmeta\x2dd0ec5187\x2d5c01\x2d49fd\x2db367\x2d066aab190f52.mount: Deactivated successfully.
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.710 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0ec5187-5c01-49fd-b367-066aab190f52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.711 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e0588d9e-ad6b-4217-9398-769ef647563d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.711 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1727c984-f918-4a8a-880e-628b50e8dc5e in datapath d0ec5187-5c01-49fd-b367-066aab190f52 unbound from our chassis#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.712 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d0ec5187-5c01-49fd-b367-066aab190f52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:38:16 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:16.713 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ea4181db-1c73-4013-b5fa-aae197baecf6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.717 2 INFO nova.compute.manager [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.718 2 DEBUG oslo.service.loopingcall [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.718 2 DEBUG nova.compute.manager [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:38:16 np0005470441 nova_compute[192626]: 2025-10-04 05:38:16.719 2 DEBUG nova.network.neutron [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:38:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:17.108 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:17.109 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.391 2 DEBUG nova.compute.manager [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.392 2 DEBUG nova.compute.manager [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing instance network info cache due to event network-changed-1727c984-f918-4a8a-880e-628b50e8dc5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.392 2 DEBUG oslo_concurrency.lockutils [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.392 2 DEBUG oslo_concurrency.lockutils [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.393 2 DEBUG nova.network.neutron [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Refreshing network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:38:17 np0005470441 nova_compute[192626]: 2025-10-04 05:38:17.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.470 2 DEBUG nova.network.neutron [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.495 2 INFO nova.compute.manager [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Took 1.78 seconds to deallocate network for instance.#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.569 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.570 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.718 2 DEBUG nova.compute.provider_tree [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.749 2 DEBUG nova.scheduler.client.report [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.784 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.816 2 INFO nova.scheduler.client.report [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Deleted allocations for instance 63d3b3b4-199b-4587-b7de-ed358aad629f#033[00m
Oct  4 01:38:18 np0005470441 nova_compute[192626]: 2025-10-04 05:38:18.927 2 DEBUG oslo_concurrency.lockutils [None req-e218943c-ddc2-42d7-9734-958b819295cf 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:19 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:19.111 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.331 2 DEBUG nova.network.neutron [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updated VIF entry in instance network info cache for port 1727c984-f918-4a8a-880e-628b50e8dc5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.332 2 DEBUG nova.network.neutron [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Updating instance_info_cache with network_info: [{"id": "1727c984-f918-4a8a-880e-628b50e8dc5e", "address": "fa:16:3e:b2:e2:f4", "network": {"id": "d0ec5187-5c01-49fd-b367-066aab190f52", "bridge": "br-int", "label": "tempest-network-smoke--749826888", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1727c984-f9", "ovs_interfaceid": "1727c984-f918-4a8a-880e-628b50e8dc5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.365 2 DEBUG oslo_concurrency.lockutils [req-eadf7573-e60c-4e07-8876-58957bbd0a31 req-1fbea995-331c-49b1-b7cd-f89359260d2f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-63d3b3b4-199b-4587-b7de-ed358aad629f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.565 2 DEBUG nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-vif-unplugged-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.566 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.566 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.566 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.567 2 DEBUG nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] No waiting events found dispatching network-vif-unplugged-1727c984-f918-4a8a-880e-628b50e8dc5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.567 2 WARNING nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received unexpected event network-vif-unplugged-1727c984-f918-4a8a-880e-628b50e8dc5e for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.567 2 DEBUG nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.568 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.568 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.568 2 DEBUG oslo_concurrency.lockutils [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "63d3b3b4-199b-4587-b7de-ed358aad629f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.569 2 DEBUG nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] No waiting events found dispatching network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.569 2 WARNING nova.compute.manager [req-03a64344-77fb-4e1d-a6b0-bb98e50219f0 req-c8fb87e8-fc17-40bc-aadd-db2e6eca1922 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received unexpected event network-vif-plugged-1727c984-f918-4a8a-880e-628b50e8dc5e for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:38:19 np0005470441 nova_compute[192626]: 2025-10-04 05:38:19.648 2 DEBUG nova.compute.manager [req-6c707371-c243-40ca-9d71-7d83a80c86eb req-fc7aca4c-9068-4dab-bbd5-91116f54fa96 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Received event network-vif-deleted-1727c984-f918-4a8a-880e-628b50e8dc5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:21 np0005470441 podman[225241]: 2025-10-04 05:38:21.315388359 +0000 UTC m=+0.066030998 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:38:21 np0005470441 podman[225240]: 2025-10-04 05:38:21.337073575 +0000 UTC m=+0.088656761 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  4 01:38:21 np0005470441 nova_compute[192626]: 2025-10-04 05:38:21.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:22 np0005470441 nova_compute[192626]: 2025-10-04 05:38:22.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:25 np0005470441 nova_compute[192626]: 2025-10-04 05:38:25.964 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556290.9620147, e1b1174b-3b77-4182-ae71-92ba2e7be833 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:38:25 np0005470441 nova_compute[192626]: 2025-10-04 05:38:25.964 2 INFO nova.compute.manager [-] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:38:25 np0005470441 nova_compute[192626]: 2025-10-04 05:38:25.994 2 DEBUG nova.compute.manager [None req-b8ab336a-a7bf-4e1a-9d31-4e38f2ed1b97 - - - - - -] [instance: e1b1174b-3b77-4182-ae71-92ba2e7be833] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:26 np0005470441 nova_compute[192626]: 2025-10-04 05:38:26.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:27 np0005470441 nova_compute[192626]: 2025-10-04 05:38:27.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:27 np0005470441 nova_compute[192626]: 2025-10-04 05:38:27.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:27 np0005470441 nova_compute[192626]: 2025-10-04 05:38:27.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:28 np0005470441 podman[225282]: 2025-10-04 05:38:28.310629218 +0000 UTC m=+0.066048208 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Oct  4 01:38:31 np0005470441 nova_compute[192626]: 2025-10-04 05:38:31.598 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556296.5972989, 63d3b3b4-199b-4587-b7de-ed358aad629f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:38:31 np0005470441 nova_compute[192626]: 2025-10-04 05:38:31.598 2 INFO nova.compute.manager [-] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:38:31 np0005470441 nova_compute[192626]: 2025-10-04 05:38:31.624 2 DEBUG nova.compute.manager [None req-98c1ee4f-818b-41c0-ac9f-be91bda686ac - - - - - -] [instance: 63d3b3b4-199b-4587-b7de-ed358aad629f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:31 np0005470441 nova_compute[192626]: 2025-10-04 05:38:31.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:32 np0005470441 podman[225302]: 2025-10-04 05:38:32.291524009 +0000 UTC m=+0.050863556 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:38:32 np0005470441 nova_compute[192626]: 2025-10-04 05:38:32.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:34 np0005470441 podman[225325]: 2025-10-04 05:38:34.330355341 +0000 UTC m=+0.079509141 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  4 01:38:36 np0005470441 nova_compute[192626]: 2025-10-04 05:38:36.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:37 np0005470441 nova_compute[192626]: 2025-10-04 05:38:37.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:38 np0005470441 podman[225344]: 2025-10-04 05:38:38.346820392 +0000 UTC m=+0.100449026 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:38:41 np0005470441 nova_compute[192626]: 2025-10-04 05:38:41.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:42 np0005470441 nova_compute[192626]: 2025-10-04 05:38:42.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:46 np0005470441 nova_compute[192626]: 2025-10-04 05:38:46.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:47 np0005470441 podman[225370]: 2025-10-04 05:38:47.311441209 +0000 UTC m=+0.058033601 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:38:47 np0005470441 podman[225371]: 2025-10-04 05:38:47.327968188 +0000 UTC m=+0.077904545 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:38:47 np0005470441 nova_compute[192626]: 2025-10-04 05:38:47.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.126 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.127 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.149 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.248 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.248 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.255 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.255 2 INFO nova.compute.claims [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.396 2 DEBUG nova.compute.provider_tree [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.412 2 DEBUG nova.scheduler.client.report [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.441 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.442 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.498 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.498 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.519 2 INFO nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.558 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.674 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.675 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.676 2 INFO nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Creating image(s)#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.676 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.677 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.677 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.690 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.743 2 DEBUG nova.policy [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '560c2ee221db4d87b04080584e8f0a48', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.777 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.779 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.779 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.797 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.852 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.853 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.888 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.889 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.889 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.945 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.946 2 DEBUG nova.virt.disk.api [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Checking if we can resize image /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:38:48 np0005470441 nova_compute[192626]: 2025-10-04 05:38:48.946 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.000 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.001 2 DEBUG nova.virt.disk.api [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Cannot resize image /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.002 2 DEBUG nova.objects.instance [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'migration_context' on Instance uuid 9e639d23-2cac-4fb4-a915-d88dfa03aad4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.027 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.027 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Ensure instance console log exists: /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.028 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.028 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.028 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:49 np0005470441 nova_compute[192626]: 2025-10-04 05:38:49.635 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Successfully created port: d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:38:50 np0005470441 nova_compute[192626]: 2025-10-04 05:38:50.709 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Successfully updated port: d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:38:50 np0005470441 nova_compute[192626]: 2025-10-04 05:38:50.728 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:38:50 np0005470441 nova_compute[192626]: 2025-10-04 05:38:50.728 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquired lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:38:50 np0005470441 nova_compute[192626]: 2025-10-04 05:38:50.728 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:38:50 np0005470441 nova_compute[192626]: 2025-10-04 05:38:50.898 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.804 2 DEBUG nova.network.neutron [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updating instance_info_cache with network_info: [{"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.833 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Releasing lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.833 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Instance network_info: |[{"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.835 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Start _get_guest_xml network_info=[{"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.839 2 WARNING nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.844 2 DEBUG nova.virt.libvirt.host [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.845 2 DEBUG nova.virt.libvirt.host [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.847 2 DEBUG nova.virt.libvirt.host [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.848 2 DEBUG nova.virt.libvirt.host [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.849 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.849 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.850 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.850 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.850 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.850 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.851 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.851 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.851 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.851 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.852 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.852 2 DEBUG nova.virt.hardware [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.856 2 DEBUG nova.virt.libvirt.vif [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=25,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJFqQvtFxd75j6M1dZnin3U3qhTzAMmgTO1toi3QXem/FknyiLDhCX9pbpgkbr0swLCPgRJrDUzU7KfrXpVK4pE4ajoILoN12c3kcYw8ytmq93tr9gi0XeGfb5H6w9IdPQ==',key_name='tempest-TestSecurityGroupsBasicOps-1645631158',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-w0eny6s7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:38:48Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=9e639d23-2cac-4fb4-a915-d88dfa03aad4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.856 2 DEBUG nova.network.os_vif_util [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.857 2 DEBUG nova.network.os_vif_util [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.858 2 DEBUG nova.objects.instance [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9e639d23-2cac-4fb4-a915-d88dfa03aad4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.877 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <uuid>9e639d23-2cac-4fb4-a915-d88dfa03aad4</uuid>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <name>instance-00000019</name>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344</nova:name>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:38:51</nova:creationTime>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:user uuid="560c2ee221db4d87b04080584e8f0a48">tempest-TestSecurityGroupsBasicOps-1075539829-project-member</nova:user>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:project uuid="2eaa5fc2c08b415c8c98103e044fc0a3">tempest-TestSecurityGroupsBasicOps-1075539829</nova:project>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        <nova:port uuid="d58c8be4-d665-45d0-b948-8ce2d9d5fee9">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="serial">9e639d23-2cac-4fb4-a915-d88dfa03aad4</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="uuid">9e639d23-2cac-4fb4-a915-d88dfa03aad4</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.config"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:f3:f3:b9"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <target dev="tapd58c8be4-d6"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/console.log" append="off"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:38:51 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:38:51 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:38:51 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:38:51 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.878 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Preparing to wait for external event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.878 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.878 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.879 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.879 2 DEBUG nova.virt.libvirt.vif [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=25,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJFqQvtFxd75j6M1dZnin3U3qhTzAMmgTO1toi3QXem/FknyiLDhCX9pbpgkbr0swLCPgRJrDUzU7KfrXpVK4pE4ajoILoN12c3kcYw8ytmq93tr9gi0XeGfb5H6w9IdPQ==',key_name='tempest-TestSecurityGroupsBasicOps-1645631158',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-w0eny6s7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:38:48Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=9e639d23-2cac-4fb4-a915-d88dfa03aad4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.880 2 DEBUG nova.network.os_vif_util [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.880 2 DEBUG nova.network.os_vif_util [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.880 2 DEBUG os_vif [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.881 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.882 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd58c8be4-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.884 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd58c8be4-d6, col_values=(('external_ids', {'iface-id': 'd58c8be4-d665-45d0-b948-8ce2d9d5fee9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:f3:b9', 'vm-uuid': '9e639d23-2cac-4fb4-a915-d88dfa03aad4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:51 np0005470441 NetworkManager[51690]: <info>  [1759556331.8863] manager: (tapd58c8be4-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.893 2 INFO os_vif [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6')#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.979 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.979 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.980 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] No VIF found with MAC fa:16:3e:f3:f3:b9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:38:51 np0005470441 podman[225433]: 2025-10-04 05:38:51.980565582 +0000 UTC m=+0.059718489 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:38:51 np0005470441 nova_compute[192626]: 2025-10-04 05:38:51.980 2 INFO nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Using config drive#033[00m
Oct  4 01:38:51 np0005470441 podman[225434]: 2025-10-04 05:38:51.985381488 +0000 UTC m=+0.062630411 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.344 2 INFO nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Creating config drive at /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.config#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.355 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjhieolw5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.467 2 DEBUG nova.compute.manager [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.467 2 DEBUG nova.compute.manager [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing instance network info cache due to event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.468 2 DEBUG oslo_concurrency.lockutils [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.468 2 DEBUG oslo_concurrency.lockutils [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.468 2 DEBUG nova.network.neutron [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing network info cache for port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.487 2 DEBUG oslo_concurrency.processutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjhieolw5" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:38:52 np0005470441 kernel: tapd58c8be4-d6: entered promiscuous mode
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.5463] manager: (tapd58c8be4-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:52Z|00194|binding|INFO|Claiming lport d58c8be4-d665-45d0-b948-8ce2d9d5fee9 for this chassis.
Oct  4 01:38:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:52Z|00195|binding|INFO|d58c8be4-d665-45d0-b948-8ce2d9d5fee9: Claiming fa:16:3e:f3:f3:b9 10.100.0.12
Oct  4 01:38:52 np0005470441 systemd-udevd[225487]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.5935] device (tapd58c8be4-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.5951] device (tapd58c8be4-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:38:52 np0005470441 systemd-machined[152624]: New machine qemu-13-instance-00000019.
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.599 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:f3:b9 10.100.0.12'], port_security=['fa:16:3e:f3:f3:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e639d23-2cac-4fb4-a915-d88dfa03aad4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a701a32e-22f0-4c4c-9e93-6922edca50cc a986649d-a7b9-49d5-bf85-6fdc469b0305', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b622aac-753c-4b0e-8346-ec91ed2d23cf, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=d58c8be4-d665-45d0-b948-8ce2d9d5fee9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.601 103689 INFO neutron.agent.ovn.metadata.agent [-] Port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 in datapath 9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b bound to our chassis#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.602 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.615 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3537ae3f-c4c3-4a81-97fb-3ee73f6c4f0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.616 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9656f2f4-21 in ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.618 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9656f2f4-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.618 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ccd5a38c-c602-4714-b6ba-ac89fa2b18b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.619 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8a9731-e14b-4ef9-8bf9-1cb174da0f6a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:52Z|00196|binding|INFO|Setting lport d58c8be4-d665-45d0-b948-8ce2d9d5fee9 ovn-installed in OVS
Oct  4 01:38:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:52Z|00197|binding|INFO|Setting lport d58c8be4-d665-45d0-b948-8ce2d9d5fee9 up in Southbound
Oct  4 01:38:52 np0005470441 systemd[1]: Started Virtual Machine qemu-13-instance-00000019.
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.631 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[283462f3-f417-400d-bb09-28dfd747a24d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.654 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[56956af1-a20d-4fe4-bc9c-67565c0700f6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.684 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[084ac60d-6b57-49fb-93e6-05759a5afff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 systemd-udevd[225492]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.6915] manager: (tap9656f2f4-20): new Veth device (/org/freedesktop/NetworkManager/Devices/84)
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.690 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[095a50d1-69cb-4e97-94b4-8c4a10309ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.720 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[f972e179-90cb-4a80-9d30-7794e91649bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.722 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a811194d-901d-483d-8592-84061c7e6861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.7465] device (tap9656f2f4-20): carrier: link connected
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.753 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[0f9009b7-47a2-4d9f-9cc6-df583d90e89c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.772 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3fecf988-34c6-436a-a245-d406901e3aad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9656f2f4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:26:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423040, 'reachable_time': 31695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225523, 'error': None, 'target': 'ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.788 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[12a292d3-417f-4544-bc26-335514446492]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe4b:2618'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 423040, 'tstamp': 423040}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225524, 'error': None, 'target': 'ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.804 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[006db6d8-3f1d-43b5-91ee-83d666475a95]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9656f2f4-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:4b:26:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 48], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423040, 'reachable_time': 31695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225525, 'error': None, 'target': 'ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.830 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bf3b1d80-39ca-4650-9759-17ab21132d9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.880 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba4fe15-294b-4966-95c0-f6a7bb566f02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.882 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9656f2f4-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.882 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.882 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9656f2f4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:52 np0005470441 NetworkManager[51690]: <info>  [1759556332.8858] manager: (tap9656f2f4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct  4 01:38:52 np0005470441 kernel: tap9656f2f4-20: entered promiscuous mode
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.888 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9656f2f4-20, col_values=(('external_ids', {'iface-id': 'c4228798-92d3-4f27-b6da-a72a96a277b2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:38:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:52Z|00198|binding|INFO|Releasing lport c4228798-92d3-4f27-b6da-a72a96a277b2 from this chassis (sb_readonly=0)
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.890 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.891 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b7598736-0bc2-45bc-96b4-dbc812287699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.892 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b.pid.haproxy
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:38:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:38:52.892 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'env', 'PROCESS_TAG=haproxy-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:38:52 np0005470441 nova_compute[192626]: 2025-10-04 05:38:52.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.026 2 DEBUG nova.compute.manager [req-5b6eb419-4ce7-4c60-861e-2ad7b3165d7f req-4ddd6edb-52e6-4c17-8ebe-7723e2e5d0f4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.026 2 DEBUG oslo_concurrency.lockutils [req-5b6eb419-4ce7-4c60-861e-2ad7b3165d7f req-4ddd6edb-52e6-4c17-8ebe-7723e2e5d0f4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.027 2 DEBUG oslo_concurrency.lockutils [req-5b6eb419-4ce7-4c60-861e-2ad7b3165d7f req-4ddd6edb-52e6-4c17-8ebe-7723e2e5d0f4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.027 2 DEBUG oslo_concurrency.lockutils [req-5b6eb419-4ce7-4c60-861e-2ad7b3165d7f req-4ddd6edb-52e6-4c17-8ebe-7723e2e5d0f4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.027 2 DEBUG nova.compute.manager [req-5b6eb419-4ce7-4c60-861e-2ad7b3165d7f req-4ddd6edb-52e6-4c17-8ebe-7723e2e5d0f4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Processing event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:38:53 np0005470441 podman[225564]: 2025-10-04 05:38:53.280418568 +0000 UTC m=+0.072972806 container create 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:38:53 np0005470441 podman[225564]: 2025-10-04 05:38:53.228300356 +0000 UTC m=+0.020854584 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:38:53 np0005470441 systemd[1]: Started libpod-conmon-7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42.scope.
Oct  4 01:38:53 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:38:53 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f56e16f94837edda1da7c908d42f5fc1b5c645731011d174a21a6a7ab3ebc8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:38:53 np0005470441 podman[225564]: 2025-10-04 05:38:53.371775624 +0000 UTC m=+0.164329852 container init 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  4 01:38:53 np0005470441 podman[225564]: 2025-10-04 05:38:53.377185318 +0000 UTC m=+0.169739526 container start 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:38:53 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [NOTICE]   (225583) : New worker (225585) forked
Oct  4 01:38:53 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [NOTICE]   (225583) : Loading success.
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.641 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.642 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556333.6406324, 9e639d23-2cac-4fb4-a915-d88dfa03aad4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.643 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] VM Started (Lifecycle Event)#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.647 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.650 2 INFO nova.virt.libvirt.driver [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Instance spawned successfully.#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.650 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.672 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.681 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.687 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.687 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.688 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.688 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.689 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.689 2 DEBUG nova.virt.libvirt.driver [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.726 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.727 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556333.6419368, 9e639d23-2cac-4fb4-a915-d88dfa03aad4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.727 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.774 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.777 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556333.6464007, 9e639d23-2cac-4fb4-a915-d88dfa03aad4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.777 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.789 2 INFO nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Took 5.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.789 2 DEBUG nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.799 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.802 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.837 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.868 2 INFO nova.compute.manager [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Took 5.66 seconds to build instance.#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.892 2 DEBUG oslo_concurrency.lockutils [None req-15cfd7ac-e41c-4052-9f34-9064df2bc140 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.945 2 DEBUG nova.network.neutron [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updated VIF entry in instance network info cache for port d58c8be4-d665-45d0-b948-8ce2d9d5fee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.945 2 DEBUG nova.network.neutron [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updating instance_info_cache with network_info: [{"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:38:53 np0005470441 nova_compute[192626]: 2025-10-04 05:38:53.964 2 DEBUG oslo_concurrency.lockutils [req-11261770-b0ec-4fe7-99f0-43752ee1e70a req-e30f188c-8af1-4451-91b4-f22a508c8697 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.112 2 DEBUG nova.compute.manager [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.113 2 DEBUG oslo_concurrency.lockutils [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.113 2 DEBUG oslo_concurrency.lockutils [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.114 2 DEBUG oslo_concurrency.lockutils [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.114 2 DEBUG nova.compute.manager [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] No waiting events found dispatching network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:38:55 np0005470441 nova_compute[192626]: 2025-10-04 05:38:55.114 2 WARNING nova.compute.manager [req-ff3c05a9-3aa5-444a-9b18-f142ccd6b6cf req-44f1d92b-1c34-4f6c-8a1a-9f8783168d56 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received unexpected event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:38:56 np0005470441 nova_compute[192626]: 2025-10-04 05:38:56.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:57 np0005470441 nova_compute[192626]: 2025-10-04 05:38:57.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:57 np0005470441 NetworkManager[51690]: <info>  [1759556337.5436] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  4 01:38:57 np0005470441 NetworkManager[51690]: <info>  [1759556337.5447] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct  4 01:38:57 np0005470441 nova_compute[192626]: 2025-10-04 05:38:57.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:57 np0005470441 nova_compute[192626]: 2025-10-04 05:38:57.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:38:57Z|00199|binding|INFO|Releasing lport c4228798-92d3-4f27-b6da-a72a96a277b2 from this chassis (sb_readonly=0)
Oct  4 01:38:57 np0005470441 nova_compute[192626]: 2025-10-04 05:38:57.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:38:57 np0005470441 nova_compute[192626]: 2025-10-04 05:38:57.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:38:59 np0005470441 podman[225595]: 2025-10-04 05:38:59.333630532 +0000 UTC m=+0.080379526 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.426 2 DEBUG nova.compute.manager [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.426 2 DEBUG nova.compute.manager [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing instance network info cache due to event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.427 2 DEBUG oslo_concurrency.lockutils [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.427 2 DEBUG oslo_concurrency.lockutils [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.427 2 DEBUG nova.network.neutron [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing network info cache for port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:38:59 np0005470441 nova_compute[192626]: 2025-10-04 05:38:59.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.744 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.830 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.892 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.893 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:39:00 np0005470441 nova_compute[192626]: 2025-10-04 05:39:00.954 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.109 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.111 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5635MB free_disk=73.46476745605469GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.111 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.111 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.197 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 9e639d23-2cac-4fb4-a915-d88dfa03aad4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.197 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.197 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.250 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.268 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.293 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.293 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:01 np0005470441 nova_compute[192626]: 2025-10-04 05:39:01.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.289 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.749 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.750 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:02 np0005470441 nova_compute[192626]: 2025-10-04 05:39:02.750 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:39:03 np0005470441 podman[225622]: 2025-10-04 05:39:03.329430495 +0000 UTC m=+0.075943478 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:39:03 np0005470441 nova_compute[192626]: 2025-10-04 05:39:03.715 2 DEBUG nova.network.neutron [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updated VIF entry in instance network info cache for port d58c8be4-d665-45d0-b948-8ce2d9d5fee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:39:03 np0005470441 nova_compute[192626]: 2025-10-04 05:39:03.716 2 DEBUG nova.network.neutron [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updating instance_info_cache with network_info: [{"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:03 np0005470441 nova_compute[192626]: 2025-10-04 05:39:03.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:03 np0005470441 nova_compute[192626]: 2025-10-04 05:39:03.740 2 DEBUG oslo_concurrency.lockutils [req-05b04ff2-3da8-40bf-be84-3766059ff69a req-4c790901-7685-419f-855c-25675352a2ad 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:05 np0005470441 podman[225653]: 2025-10-04 05:39:05.301891551 +0000 UTC m=+0.056447846 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  4 01:39:05 np0005470441 nova_compute[192626]: 2025-10-04 05:39:05.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:05 np0005470441 nova_compute[192626]: 2025-10-04 05:39:05.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:06 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:06Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:f3:b9 10.100.0.12
Oct  4 01:39:06 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:06Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:f3:b9 10.100.0.12
Oct  4 01:39:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:06.746 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:06.748 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:06.749 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:06 np0005470441 nova_compute[192626]: 2025-10-04 05:39:06.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:07 np0005470441 nova_compute[192626]: 2025-10-04 05:39:07.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:09 np0005470441 podman[225682]: 2025-10-04 05:39:09.354315656 +0000 UTC m=+0.105307524 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:39:09 np0005470441 nova_compute[192626]: 2025-10-04 05:39:09.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:11 np0005470441 nova_compute[192626]: 2025-10-04 05:39:11.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:12 np0005470441 nova_compute[192626]: 2025-10-04 05:39:12.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:16Z|00200|binding|INFO|Releasing lport c4228798-92d3-4f27-b6da-a72a96a277b2 from this chassis (sb_readonly=0)
Oct  4 01:39:16 np0005470441 nova_compute[192626]: 2025-10-04 05:39:16.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:16 np0005470441 nova_compute[192626]: 2025-10-04 05:39:16.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:17 np0005470441 nova_compute[192626]: 2025-10-04 05:39:17.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:18.150 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:39:18 np0005470441 nova_compute[192626]: 2025-10-04 05:39:18.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:18.152 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:39:18 np0005470441 podman[225708]: 2025-10-04 05:39:18.311390377 +0000 UTC m=+0.061069437 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:39:18 np0005470441 podman[225709]: 2025-10-04 05:39:18.311581582 +0000 UTC m=+0.054691725 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:39:21 np0005470441 nova_compute[192626]: 2025-10-04 05:39:21.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:22 np0005470441 podman[225751]: 2025-10-04 05:39:22.315450287 +0000 UTC m=+0.062395055 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Oct  4 01:39:22 np0005470441 podman[225750]: 2025-10-04 05:39:22.328715324 +0000 UTC m=+0.071513304 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid)
Oct  4 01:39:22 np0005470441 nova_compute[192626]: 2025-10-04 05:39:22.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:23 np0005470441 nova_compute[192626]: 2025-10-04 05:39:23.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:25.153 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:26 np0005470441 nova_compute[192626]: 2025-10-04 05:39:26.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:27 np0005470441 nova_compute[192626]: 2025-10-04 05:39:27.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:30 np0005470441 podman[225791]: 2025-10-04 05:39:30.300284044 +0000 UTC m=+0.058502854 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct  4 01:39:32 np0005470441 nova_compute[192626]: 2025-10-04 05:39:32.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:32 np0005470441 nova_compute[192626]: 2025-10-04 05:39:32.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:34 np0005470441 podman[225812]: 2025-10-04 05:39:34.299299129 +0000 UTC m=+0.050352112 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:39:34 np0005470441 nova_compute[192626]: 2025-10-04 05:39:34.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:36 np0005470441 podman[225836]: 2025-10-04 05:39:36.304322048 +0000 UTC m=+0.057177256 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:39:37 np0005470441 nova_compute[192626]: 2025-10-04 05:39:37.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:37 np0005470441 nova_compute[192626]: 2025-10-04 05:39:37.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.191 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.191 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.221 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.235 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.236 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.264 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.351 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.351 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.360 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.361 2 INFO nova.compute.claims [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.377 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.507 2 DEBUG nova.compute.provider_tree [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.526 2 DEBUG nova.scheduler.client.report [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.562 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.563 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.567 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.574 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.574 2 INFO nova.compute.claims [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Claim successful on node compute-1.ctlplane.example.com
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.654 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.654 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.684 2 INFO nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.707 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.817 2 DEBUG nova.compute.provider_tree [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.824 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.825 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.825 2 INFO nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating image(s)
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.826 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.826 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.827 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.839 2 DEBUG nova.scheduler.client.report [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.842 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.867 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.868 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.902 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.902 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.903 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.919 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.950 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.951 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.974 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.974 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:39 np0005470441 nova_compute[192626]: 2025-10-04 05:39:39.991 2 INFO nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.008 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.009 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.009 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.028 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.066 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.067 2 DEBUG nova.virt.disk.api [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Checking if we can resize image /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.067 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.126 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.127 2 DEBUG nova.virt.disk.api [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Cannot resize image /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.127 2 DEBUG nova.objects.instance [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'migration_context' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.135 2 DEBUG nova.policy [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.154 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.155 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Ensure instance console log exists: /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.155 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.156 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.156 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.157 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.158 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.158 2 INFO nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Creating image(s)
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.159 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.159 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.160 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.172 2 DEBUG nova.policy [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.174 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.229 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.230 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.230 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.241 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.309 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.311 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 podman[225871]: 2025-10-04 05:39:40.343316561 +0000 UTC m=+0.093486518 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.346 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.347 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.348 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.407 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.409 2 DEBUG nova.virt.disk.api [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.410 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.466 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.468 2 DEBUG nova.virt.disk.api [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.469 2 DEBUG nova.objects.instance [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 369c5da6-9c6d-48e7-a402-88f996ed8276 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.488 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.488 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Ensure instance console log exists: /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.489 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.490 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:39:40 np0005470441 nova_compute[192626]: 2025-10-04 05:39:40.490 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:39:41 np0005470441 nova_compute[192626]: 2025-10-04 05:39:41.845 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Successfully created port: ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:39:41 np0005470441 nova_compute[192626]: 2025-10-04 05:39:41.872 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Successfully created port: 1c32865e-e189-4f96-b7d5-f3c3a5136407 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:39:42 np0005470441 nova_compute[192626]: 2025-10-04 05:39:42.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:42 np0005470441 nova_compute[192626]: 2025-10-04 05:39:42.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.667 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Successfully updated port: ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.688 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.689 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquired lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.689 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.706 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Successfully updated port: 1c32865e-e189-4f96-b7d5-f3c3a5136407 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.722 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.723 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.723 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.798 2 DEBUG nova.compute.manager [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.799 2 DEBUG nova.compute.manager [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing instance network info cache due to event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.800 2 DEBUG oslo_concurrency.lockutils [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.874 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:39:43 np0005470441 nova_compute[192626]: 2025-10-04 05:39:43.882 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.078 2 DEBUG nova.network.neutron [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updating instance_info_cache with network_info: [{"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.092 2 DEBUG nova.network.neutron [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.110 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.110 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Instance network_info: |[{"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.112 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Start _get_guest_xml network_info=[{"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.116 2 WARNING nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.117 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Releasing lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.117 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance network_info: |[{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.118 2 DEBUG oslo_concurrency.lockutils [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.118 2 DEBUG nova.network.neutron [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.120 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start _get_guest_xml network_info=[{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.123 2 WARNING nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.124 2 DEBUG nova.virt.libvirt.host [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.125 2 DEBUG nova.virt.libvirt.host [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.128 2 DEBUG nova.virt.libvirt.host [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.128 2 DEBUG nova.virt.libvirt.host [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.129 2 DEBUG nova.virt.libvirt.host [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.129 2 DEBUG nova.virt.libvirt.host [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.130 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.131 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.131 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.131 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.132 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.132 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.132 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.132 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.132 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.133 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.133 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.133 2 DEBUG nova.virt.hardware [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.136 2 DEBUG nova.virt.libvirt.vif [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-971816448',display_name='tempest-TestNetworkBasicOps-server-971816448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-971816448',id=29,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHibsOavvT5Q22UxMUeFHlMlfvZtwgbPzWkMoHbvilBwdbM4rdSMfjdqdv+XQ5xhGsZ0gFbusTh5D97rwOcKuML1QRTuZJX9N8yVKv1zSUJMFdE0q9S5hSidpnOFsu8Dfg==',key_name='tempest-TestNetworkBasicOps-778878579',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-1vg8r9z1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:39:40Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=369c5da6-9c6d-48e7-a402-88f996ed8276,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.136 2 DEBUG nova.network.os_vif_util [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.137 2 DEBUG nova.network.os_vif_util [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.138 2 DEBUG nova.objects.instance [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 369c5da6-9c6d-48e7-a402-88f996ed8276 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.144 2 DEBUG nova.virt.libvirt.host [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.144 2 DEBUG nova.virt.libvirt.host [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.145 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.145 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.146 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.146 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.146 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.146 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.147 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.147 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.147 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.147 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.147 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.148 2 DEBUG nova.virt.hardware [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.150 2 DEBUG nova.virt.libvirt.vif [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:39:39Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.150 2 DEBUG nova.network.os_vif_util [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.151 2 DEBUG nova.network.os_vif_util [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.152 2 DEBUG nova.objects.instance [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.154 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <uuid>369c5da6-9c6d-48e7-a402-88f996ed8276</uuid>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <name>instance-0000001d</name>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-971816448</nova:name>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:39:45</nova:creationTime>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:port uuid="1c32865e-e189-4f96-b7d5-f3c3a5136407">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="serial">369c5da6-9c6d-48e7-a402-88f996ed8276</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="uuid">369c5da6-9c6d-48e7-a402-88f996ed8276</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.config"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:44:b1:43"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="tap1c32865e-e1"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/console.log" append="off"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:39:45 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:39:45 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.155 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Preparing to wait for external event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.155 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.155 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.155 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.156 2 DEBUG nova.virt.libvirt.vif [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-971816448',display_name='tempest-TestNetworkBasicOps-server-971816448',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-971816448',id=29,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHibsOavvT5Q22UxMUeFHlMlfvZtwgbPzWkMoHbvilBwdbM4rdSMfjdqdv+XQ5xhGsZ0gFbusTh5D97rwOcKuML1QRTuZJX9N8yVKv1zSUJMFdE0q9S5hSidpnOFsu8Dfg==',key_name='tempest-TestNetworkBasicOps-778878579',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-1vg8r9z1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:39:40Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=369c5da6-9c6d-48e7-a402-88f996ed8276,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.156 2 DEBUG nova.network.os_vif_util [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.157 2 DEBUG nova.network.os_vif_util [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.157 2 DEBUG os_vif [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.158 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.159 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1c32865e-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.161 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1c32865e-e1, col_values=(('external_ids', {'iface-id': '1c32865e-e189-4f96-b7d5-f3c3a5136407', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:b1:43', 'vm-uuid': '369c5da6-9c6d-48e7-a402-88f996ed8276'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 NetworkManager[51690]: <info>  [1759556385.1636] manager: (tap1c32865e-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.168 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <uuid>1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</uuid>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <name>instance-0000001c</name>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-506251172</nova:name>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:39:45</nova:creationTime>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:user uuid="d65c768451494a3f9e4f9a238fa5c40d">tempest-TestNetworkAdvancedServerOps-1635331179-project-member</nova:user>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:project uuid="d0c087ea0f62444e80490916b42c760f">tempest-TestNetworkAdvancedServerOps-1635331179</nova:project>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        <nova:port uuid="ea5d5ce9-b8c0-45ae-8462-bfa1288280c9">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="serial">1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="uuid">1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:94:6c:37"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <target dev="tapea5d5ce9-b8"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/console.log" append="off"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:39:45 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:39:45 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:39:45 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:39:45 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.169 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Preparing to wait for external event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.169 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.169 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.170 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.170 2 DEBUG nova.virt.libvirt.vif [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:39:39Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.171 2 DEBUG nova.network.os_vif_util [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.171 2 DEBUG nova.network.os_vif_util [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.172 2 DEBUG os_vif [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.173 2 INFO os_vif [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1')#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea5d5ce9-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea5d5ce9-b8, col_values=(('external_ids', {'iface-id': 'ea5d5ce9-b8c0-45ae-8462-bfa1288280c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:6c:37', 'vm-uuid': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 NetworkManager[51690]: <info>  [1759556385.1802] manager: (tapea5d5ce9-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.185 2 INFO os_vif [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8')#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.236 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.237 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.237 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:44:b1:43, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.237 2 INFO nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Using config drive#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.241 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.241 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.241 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No VIF found with MAC fa:16:3e:94:6c:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.242 2 INFO nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Using config drive#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.894 2 DEBUG nova.compute.manager [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-changed-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.895 2 DEBUG nova.compute.manager [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Refreshing instance network info cache due to event network-changed-1c32865e-e189-4f96-b7d5-f3c3a5136407. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.895 2 DEBUG oslo_concurrency.lockutils [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.895 2 DEBUG oslo_concurrency.lockutils [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:45 np0005470441 nova_compute[192626]: 2025-10-04 05:39:45.895 2 DEBUG nova.network.neutron [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Refreshing network info cache for port 1c32865e-e189-4f96-b7d5-f3c3a5136407 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.717 2 INFO nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Creating config drive at /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.config#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.722 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_4xax5e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.762 2 INFO nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating config drive at /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.768 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2wmg0dv4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.864 2 DEBUG oslo_concurrency.processutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_4xax5e" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.890 2 DEBUG oslo_concurrency.processutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2wmg0dv4" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9231] manager: (tap1c32865e-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct  4 01:39:47 np0005470441 kernel: tap1c32865e-e1: entered promiscuous mode
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00201|binding|INFO|Claiming lport 1c32865e-e189-4f96-b7d5-f3c3a5136407 for this chassis.
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00202|binding|INFO|1c32865e-e189-4f96-b7d5-f3c3a5136407: Claiming fa:16:3e:44:b1:43 10.100.0.8
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.942 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:b1:43 10.100.0.8'], port_security=['fa:16:3e:44:b1:43 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': '544dc3d9-8b22-4833-9188-b1166a076883', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=81506072-2e1b-4219-b643-c8187215261f, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1c32865e-e189-4f96-b7d5-f3c3a5136407) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.944 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1c32865e-e189-4f96-b7d5-f3c3a5136407 in datapath 04999c96-51a9-44fc-b4c8-a6213c9bc268 bound to our chassis#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.945 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 04999c96-51a9-44fc-b4c8-a6213c9bc268#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.956 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f8d5df01-4ba2-4086-ae50-533e5d42236e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.957 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap04999c96-51 in ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00203|binding|INFO|Setting lport 1c32865e-e189-4f96-b7d5-f3c3a5136407 ovn-installed in OVS
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00204|binding|INFO|Setting lport 1c32865e-e189-4f96-b7d5-f3c3a5136407 up in Southbound
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.958 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap04999c96-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.958 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0f8ed162-ccf2-4f59-a462-d45e9d147c73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.959 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1af149ef-b41e-483c-82fa-51904d45bda0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 systemd-udevd[225947]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:39:47 np0005470441 kernel: tapea5d5ce9-b8: entered promiscuous mode
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9653] manager: (tapea5d5ce9-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Oct  4 01:39:47 np0005470441 systemd-udevd[225951]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:39:47 np0005470441 systemd-machined[152624]: New machine qemu-14-instance-0000001d.
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.971 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[dbfe7456-8afb-47b6-bbad-e6c9d44ac39f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00205|binding|INFO|Claiming lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for this chassis.
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00206|binding|INFO|ea5d5ce9-b8c0-45ae-8462-bfa1288280c9: Claiming fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9789] device (tap1c32865e-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9796] device (tap1c32865e-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9821] device (tapea5d5ce9-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:39:47 np0005470441 NetworkManager[51690]: <info>  [1759556387.9827] device (tapea5d5ce9-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:39:47 np0005470441 systemd[1]: Started Virtual Machine qemu-14-instance-0000001d.
Oct  4 01:39:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:47.991 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:6c:37 10.100.0.8'], port_security=['fa:16:3e:94:6c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1116f56-9520-48d8-8bb2-2519f97b3338', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48042eb9-5c9a-49d3-9ddf-88f8ff74c14b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67ce5926-63fc-4d20-a2ee-8b5c0eb6e716, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00207|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 ovn-installed in OVS
Oct  4 01:39:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:47Z|00208|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 up in Southbound
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:47 np0005470441 nova_compute[192626]: 2025-10-04 05:39:47.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.001 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[378cd241-51b2-469a-9d7f-efecee6f1134]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 systemd-machined[152624]: New machine qemu-15-instance-0000001c.
Oct  4 01:39:48 np0005470441 systemd[1]: Started Virtual Machine qemu-15-instance-0000001c.
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.030 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[aef8149d-0448-4380-b7d9-26514e969017]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 NetworkManager[51690]: <info>  [1759556388.0368] manager: (tap04999c96-50): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.036 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c88de755-b6d0-495a-a18e-55b5f4a589ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.069 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[b832639a-17e9-40db-b29c-7ac1ede1387a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.072 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[eab1be43-7915-4c0f-b32a-fa524f03991b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 NetworkManager[51690]: <info>  [1759556388.0973] device (tap04999c96-50): carrier: link connected
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.105 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c488ccae-4cd3-4173-ad59-b707891c379b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.122 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e41e6ba9-0aa5-4bbd-ae83-36c73e819a74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04999c96-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:bf:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428575, 'reachable_time': 15476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225991, 'error': None, 'target': 'ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.137 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3180a9-7cf7-4b06-9e7c-c7407edadbae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:bf9f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428575, 'tstamp': 428575}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225992, 'error': None, 'target': 'ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.157 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[693017c6-759a-465e-989c-6df2ba178a32]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap04999c96-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:bf:9f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428575, 'reachable_time': 15476, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225993, 'error': None, 'target': 'ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.184 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1d9976-e02b-4411-93f8-c639a45bc848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.240 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcc686d-e931-40b1-a3bf-c4653e671a7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.242 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04999c96-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.243 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.243 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04999c96-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:48 np0005470441 kernel: tap04999c96-50: entered promiscuous mode
Oct  4 01:39:48 np0005470441 NetworkManager[51690]: <info>  [1759556388.2465] manager: (tap04999c96-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.249 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap04999c96-50, col_values=(('external_ids', {'iface-id': 'ac619930-3558-40f2-b142-298dd722addb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:48Z|00209|binding|INFO|Releasing lport ac619930-3558-40f2-b142-298dd722addb from this chassis (sb_readonly=0)
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.253 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/04999c96-51a9-44fc-b4c8-a6213c9bc268.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/04999c96-51a9-44fc-b4c8-a6213c9bc268.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.255 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[07ea7add-3fd5-4f14-ac40-dabcfb54fecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.256 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-04999c96-51a9-44fc-b4c8-a6213c9bc268
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/04999c96-51a9-44fc-b4c8-a6213c9bc268.pid.haproxy
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 04999c96-51a9-44fc-b4c8-a6213c9bc268
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.259 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'env', 'PROCESS_TAG=haproxy-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/04999c96-51a9-44fc-b4c8-a6213c9bc268.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:48 np0005470441 podman[226037]: 2025-10-04 05:39:48.662581935 +0000 UTC m=+0.065648997 container create afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:39:48 np0005470441 systemd[1]: Started libpod-conmon-afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832.scope.
Oct  4 01:39:48 np0005470441 podman[226037]: 2025-10-04 05:39:48.619916572 +0000 UTC m=+0.022983624 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:39:48 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:39:48 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17752b182b67c71958fc326891b03996b72c2689dcd3477be31e7a9b81d22e8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:39:48 np0005470441 podman[226037]: 2025-10-04 05:39:48.740240262 +0000 UTC m=+0.143307314 container init afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true)
Oct  4 01:39:48 np0005470441 podman[226037]: 2025-10-04 05:39:48.746352566 +0000 UTC m=+0.149419588 container start afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:39:48 np0005470441 podman[226053]: 2025-10-04 05:39:48.756443883 +0000 UTC m=+0.058003150 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:39:48 np0005470441 podman[226050]: 2025-10-04 05:39:48.758166022 +0000 UTC m=+0.059017509 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:39:48 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [NOTICE]   (226087) : New worker (226097) forked
Oct  4 01:39:48 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [NOTICE]   (226087) : Loading success.
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.816 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556388.8152916, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.817 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Started (Lifecycle Event)#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.818 103689 INFO neutron.agent.ovn.metadata.agent [-] Port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 in datapath f1116f56-9520-48d8-8bb2-2519f97b3338 unbound from our chassis#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.820 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1116f56-9520-48d8-8bb2-2519f97b3338#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.830 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[257e3929-659d-40d0-909b-89a70be6a54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.831 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf1116f56-91 in ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.833 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf1116f56-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.833 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9d3b6e98-01db-4da2-a4c2-f634a95d18ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.833 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[31c0f3af-e1b0-44d8-aa24-377f5d4dfc46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.844 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[452fc08a-e658-4de8-b235-a70a14e7bf6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.851 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.855 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556388.8154683, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.856 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.870 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[10a6e1da-b81d-4397-9add-5a055f79970c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.881 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.885 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.898 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[3bad6f1c-9d93-4eef-935d-3d7e182f7fb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.903 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[07b98ef8-6c78-4fe7-8c1c-08085f22b5f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 NetworkManager[51690]: <info>  [1759556388.9043] manager: (tapf1116f56-90): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.906 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.907 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556388.8564267, 369c5da6-9c6d-48e7-a402-88f996ed8276 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.908 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] VM Started (Lifecycle Event)#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.931 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[64c0847f-0c4c-4de9-b199-925cc6e8f97d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.935 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.937 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[caa1f677-fa1f-4b33-a4a1-2f97cdc63f50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.940 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556388.856467, 369c5da6-9c6d-48e7-a402-88f996ed8276 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.940 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:39:48 np0005470441 NetworkManager[51690]: <info>  [1759556388.9598] device (tapf1116f56-90): carrier: link connected
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.960 2 DEBUG nova.compute.manager [req-809f78f3-a587-4cba-8c10-7c60d5465276 req-d3cbe793-b764-436a-8acd-3d99e3d607e8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.960 2 DEBUG oslo_concurrency.lockutils [req-809f78f3-a587-4cba-8c10-7c60d5465276 req-d3cbe793-b764-436a-8acd-3d99e3d607e8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.960 2 DEBUG oslo_concurrency.lockutils [req-809f78f3-a587-4cba-8c10-7c60d5465276 req-d3cbe793-b764-436a-8acd-3d99e3d607e8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.961 2 DEBUG oslo_concurrency.lockutils [req-809f78f3-a587-4cba-8c10-7c60d5465276 req-d3cbe793-b764-436a-8acd-3d99e3d607e8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.961 2 DEBUG nova.compute.manager [req-809f78f3-a587-4cba-8c10-7c60d5465276 req-d3cbe793-b764-436a-8acd-3d99e3d607e8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Processing event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.962 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.969 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.968 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb3daf9-4dee-4a6f-9ff4-bd172bce4b48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.972 2 INFO nova.virt.libvirt.driver [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Instance spawned successfully.#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.973 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:39:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:48.989 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[467c32ea-f251-4213-89b0-d38fd32e45da]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1116f56-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:0b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428661, 'reachable_time': 42876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226117, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.990 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.996 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556388.9644096, 369c5da6-9c6d-48e7-a402-88f996ed8276 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.997 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:39:48 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.999 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:48.999 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.000 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.000 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.001 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.001 2 DEBUG nova.virt.libvirt.driver [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.005 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f01659f5-473c-4cce-9e90-1f3313d49e8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 428661, 'tstamp': 428661}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226118, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.022 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1d01398d-84fa-4335-84dc-3a3e7e138d5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1116f56-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:0b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428661, 'reachable_time': 42876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226119, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.032 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.035 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.049 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[222cf2ee-ed49-48c1-82fa-8aacb934e6b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.076 2 DEBUG nova.compute.manager [req-e36ce2a1-575d-46e0-acef-fcfd7798f4c7 req-cd1a6327-d2de-4bc1-befa-0538afbc75a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.076 2 DEBUG oslo_concurrency.lockutils [req-e36ce2a1-575d-46e0-acef-fcfd7798f4c7 req-cd1a6327-d2de-4bc1-befa-0538afbc75a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.077 2 DEBUG oslo_concurrency.lockutils [req-e36ce2a1-575d-46e0-acef-fcfd7798f4c7 req-cd1a6327-d2de-4bc1-befa-0538afbc75a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.077 2 DEBUG oslo_concurrency.lockutils [req-e36ce2a1-575d-46e0-acef-fcfd7798f4c7 req-cd1a6327-d2de-4bc1-befa-0538afbc75a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.077 2 DEBUG nova.compute.manager [req-e36ce2a1-575d-46e0-acef-fcfd7798f4c7 req-cd1a6327-d2de-4bc1-befa-0538afbc75a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Processing event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.078 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.081 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.083 2 INFO nova.virt.libvirt.driver [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance spawned successfully.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.083 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.090 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.091 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556389.0806777, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.091 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.103 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f2767b07-9e37-415c-a30d-f8ce63752f81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.105 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1116f56-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.105 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.105 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1116f56-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:49 np0005470441 NetworkManager[51690]: <info>  [1759556389.1075] manager: (tapf1116f56-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct  4 01:39:49 np0005470441 kernel: tapf1116f56-90: entered promiscuous mode
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.110 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1116f56-90, col_values=(('external_ids', {'iface-id': '97bc4ce0-5f5c-4023-9415-069692deb3ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.111 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.113 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.114 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1a20976f-9756-4ccc-9a91-e6706a1172d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.114 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-f1116f56-9520-48d8-8bb2-2519f97b3338
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID f1116f56-9520-48d8-8bb2-2519f97b3338
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:39:49 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:49.115 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'env', 'PROCESS_TAG=haproxy-f1116f56-9520-48d8-8bb2-2519f97b3338', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f1116f56-9520-48d8-8bb2-2519f97b3338.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.114 2 INFO nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Took 8.96 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:39:49 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:49Z|00210|binding|INFO|Releasing lport 97bc4ce0-5f5c-4023-9415-069692deb3ec from this chassis (sb_readonly=0)
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.116 2 DEBUG nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.124 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.124 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.125 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.125 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.126 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.127 2 DEBUG nova.virt.libvirt.driver [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.132 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.180 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.222 2 INFO nova.compute.manager [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Took 9.87 seconds to build instance.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.229 2 INFO nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Took 9.40 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.230 2 DEBUG nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.241 2 DEBUG oslo_concurrency.lockutils [None req-ace2f974-3852-41b5-9342-8eaa66f712f8 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.290 2 INFO nova.compute.manager [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Took 9.98 seconds to build instance.#033[00m
Oct  4 01:39:49 np0005470441 nova_compute[192626]: 2025-10-04 05:39:49.311 2 DEBUG oslo_concurrency.lockutils [None req-75a677f8-0347-4f1c-9086-5403583b8fbd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:49 np0005470441 podman[226151]: 2025-10-04 05:39:49.464396706 +0000 UTC m=+0.047599684 container create 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  4 01:39:49 np0005470441 systemd[1]: Started libpod-conmon-4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07.scope.
Oct  4 01:39:49 np0005470441 podman[226151]: 2025-10-04 05:39:49.439157298 +0000 UTC m=+0.022360296 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:39:49 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:39:49 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92e9c9aa52db566a69381c205bfff2ce7d8a9da7e1dd113e8de7e02800f7ca0e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:39:49 np0005470441 podman[226151]: 2025-10-04 05:39:49.555291829 +0000 UTC m=+0.138494807 container init 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:39:49 np0005470441 podman[226151]: 2025-10-04 05:39:49.56235839 +0000 UTC m=+0.145561358 container start 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0)
Oct  4 01:39:49 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [NOTICE]   (226170) : New worker (226172) forked
Oct  4 01:39:49 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [NOTICE]   (226170) : Loading success.
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.411 2 DEBUG nova.network.neutron [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updated VIF entry in instance network info cache for port 1c32865e-e189-4f96-b7d5-f3c3a5136407. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.411 2 DEBUG nova.network.neutron [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updating instance_info_cache with network_info: [{"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.430 2 DEBUG oslo_concurrency.lockutils [req-5ff7365b-65e1-45d5-a6b0-65ddf3669a37 req-0d033c4f-518e-4f9d-883b-84fe569f9f05 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.623 2 DEBUG nova.network.neutron [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updated VIF entry in instance network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.624 2 DEBUG nova.network.neutron [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:50 np0005470441 nova_compute[192626]: 2025-10-04 05:39:50.648 2 DEBUG oslo_concurrency.lockutils [req-ab1e0b8f-acaa-4136-8caa-25e994a0afea req-4eb27b22-1597-48c0-ac77-f72fe4c63e8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.269 2 DEBUG nova.compute.manager [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.270 2 DEBUG oslo_concurrency.lockutils [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.270 2 DEBUG oslo_concurrency.lockutils [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.270 2 DEBUG oslo_concurrency.lockutils [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.270 2 DEBUG nova.compute.manager [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] No waiting events found dispatching network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.271 2 WARNING nova.compute.manager [req-e0289ed1-a83b-47ae-98e6-94a6f8dd0140 req-7dc67107-176f-4820-b187-6e0445df194a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received unexpected event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.345 2 DEBUG nova.compute.manager [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.345 2 DEBUG oslo_concurrency.lockutils [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.345 2 DEBUG oslo_concurrency.lockutils [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.345 2 DEBUG oslo_concurrency.lockutils [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.345 2 DEBUG nova.compute.manager [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] No waiting events found dispatching network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:39:51 np0005470441 nova_compute[192626]: 2025-10-04 05:39:51.346 2 WARNING nova.compute.manager [req-8f40ef4f-c92a-45e2-b594-a1748b797d28 req-95fefec6-acea-48dc-a1d3-0959711a3f7d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received unexpected event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:39:52 np0005470441 nova_compute[192626]: 2025-10-04 05:39:52.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:53 np0005470441 podman[226184]: 2025-10-04 05:39:53.305555255 +0000 UTC m=+0.055399066 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:39:53 np0005470441 podman[226183]: 2025-10-04 05:39:53.309792465 +0000 UTC m=+0.059633896 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 01:39:53 np0005470441 nova_compute[192626]: 2025-10-04 05:39:53.369 2 DEBUG nova.compute.manager [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-changed-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:53 np0005470441 nova_compute[192626]: 2025-10-04 05:39:53.369 2 DEBUG nova.compute.manager [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Refreshing instance network info cache due to event network-changed-1c32865e-e189-4f96-b7d5-f3c3a5136407. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:39:53 np0005470441 nova_compute[192626]: 2025-10-04 05:39:53.369 2 DEBUG oslo_concurrency.lockutils [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:53 np0005470441 nova_compute[192626]: 2025-10-04 05:39:53.370 2 DEBUG oslo_concurrency.lockutils [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:53 np0005470441 nova_compute[192626]: 2025-10-04 05:39:53.370 2 DEBUG nova.network.neutron [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Refreshing network info cache for port 1c32865e-e189-4f96-b7d5-f3c3a5136407 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.707 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.709 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.709 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.709 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.710 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.711 2 INFO nova.compute.manager [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Terminating instance#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.712 2 DEBUG nova.compute.manager [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:39:54 np0005470441 kernel: tapd58c8be4-d6 (unregistering): left promiscuous mode
Oct  4 01:39:54 np0005470441 NetworkManager[51690]: <info>  [1759556394.7417] device (tapd58c8be4-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:39:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:54Z|00211|binding|INFO|Releasing lport d58c8be4-d665-45d0-b948-8ce2d9d5fee9 from this chassis (sb_readonly=0)
Oct  4 01:39:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:54Z|00212|binding|INFO|Setting lport d58c8be4-d665-45d0-b948-8ce2d9d5fee9 down in Southbound
Oct  4 01:39:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:39:54Z|00213|binding|INFO|Removing iface tapd58c8be4-d6 ovn-installed in OVS
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:54.788 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:f3:b9 10.100.0.12'], port_security=['fa:16:3e:f3:f3:b9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '9e639d23-2cac-4fb4-a915-d88dfa03aad4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2eaa5fc2c08b415c8c98103e044fc0a3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a701a32e-22f0-4c4c-9e93-6922edca50cc a986649d-a7b9-49d5-bf85-6fdc469b0305', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b622aac-753c-4b0e-8346-ec91ed2d23cf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=d58c8be4-d665-45d0-b948-8ce2d9d5fee9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:39:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:54.789 103689 INFO neutron.agent.ovn.metadata.agent [-] Port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 in datapath 9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b unbound from our chassis#033[00m
Oct  4 01:39:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:54.791 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:54.794 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f106a1-4a51-4779-a631-85270a7beaba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:54.794 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b namespace which is not needed anymore#033[00m
Oct  4 01:39:54 np0005470441 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct  4 01:39:54 np0005470441 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000019.scope: Consumed 15.249s CPU time.
Oct  4 01:39:54 np0005470441 systemd-machined[152624]: Machine qemu-13-instance-00000019 terminated.
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [NOTICE]   (225583) : haproxy version is 2.8.14-c23fe91
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [NOTICE]   (225583) : path to executable is /usr/sbin/haproxy
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [WARNING]  (225583) : Exiting Master process...
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [WARNING]  (225583) : Exiting Master process...
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [ALERT]    (225583) : Current worker (225585) exited with code 143 (Terminated)
Oct  4 01:39:54 np0005470441 neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b[225579]: [WARNING]  (225583) : All workers exited. Exiting... (0)
Oct  4 01:39:54 np0005470441 systemd[1]: libpod-7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42.scope: Deactivated successfully.
Oct  4 01:39:54 np0005470441 podman[226244]: 2025-10-04 05:39:54.926753684 +0000 UTC m=+0.043591800 container died 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:39:54 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42-userdata-shm.mount: Deactivated successfully.
Oct  4 01:39:54 np0005470441 systemd[1]: var-lib-containers-storage-overlay-6f56e16f94837edda1da7c908d42f5fc1b5c645731011d174a21a6a7ab3ebc8b-merged.mount: Deactivated successfully.
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.981 2 INFO nova.virt.libvirt.driver [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Instance destroyed successfully.#033[00m
Oct  4 01:39:54 np0005470441 nova_compute[192626]: 2025-10-04 05:39:54.982 2 DEBUG nova.objects.instance [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lazy-loading 'resources' on Instance uuid 9e639d23-2cac-4fb4-a915-d88dfa03aad4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:39:54 np0005470441 podman[226244]: 2025-10-04 05:39:54.985765742 +0000 UTC m=+0.102603858 container cleanup 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:39:54 np0005470441 systemd[1]: libpod-conmon-7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42.scope: Deactivated successfully.
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.007 2 DEBUG nova.virt.libvirt.vif [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:38:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1075539829-access_point-742791344',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1075539829-ac',id=25,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJFqQvtFxd75j6M1dZnin3U3qhTzAMmgTO1toi3QXem/FknyiLDhCX9pbpgkbr0swLCPgRJrDUzU7KfrXpVK4pE4ajoILoN12c3kcYw8ytmq93tr9gi0XeGfb5H6w9IdPQ==',key_name='tempest-TestSecurityGroupsBasicOps-1645631158',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:38:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2eaa5fc2c08b415c8c98103e044fc0a3',ramdisk_id='',reservation_id='r-w0eny6s7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1075539829',owner_user_name='tempest-TestSecurityGroupsBasicOps-1075539829-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:38:53Z,user_data=None,user_id='560c2ee221db4d87b04080584e8f0a48',uuid=9e639d23-2cac-4fb4-a915-d88dfa03aad4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.007 2 DEBUG nova.network.os_vif_util [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converting VIF {"id": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "address": "fa:16:3e:f3:f3:b9", "network": {"id": "9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b", "bridge": "br-int", "label": "tempest-network-smoke--2001379221", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2eaa5fc2c08b415c8c98103e044fc0a3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd58c8be4-d6", "ovs_interfaceid": "d58c8be4-d665-45d0-b948-8ce2d9d5fee9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.008 2 DEBUG nova.network.os_vif_util [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.008 2 DEBUG os_vif [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.010 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd58c8be4-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.015 2 INFO os_vif [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:f3:b9,bridge_name='br-int',has_traffic_filtering=True,id=d58c8be4-d665-45d0-b948-8ce2d9d5fee9,network=Network(9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd58c8be4-d6')#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.016 2 INFO nova.virt.libvirt.driver [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Deleting instance files /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4_del#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.016 2 INFO nova.virt.libvirt.driver [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Deletion of /var/lib/nova/instances/9e639d23-2cac-4fb4-a915-d88dfa03aad4_del complete#033[00m
Oct  4 01:39:55 np0005470441 podman[226288]: 2025-10-04 05:39:55.050358358 +0000 UTC m=+0.041719617 container remove 7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.056 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4181e5c7-1fa1-41ba-823f-7f960059e743]: (4, ('Sat Oct  4 05:39:54 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b (7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42)\n7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42\nSat Oct  4 05:39:54 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b (7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42)\n7d8773c07a092316023e80178274b1b438a0cfda1342c807f447f055e543fd42\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.058 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5173e0b2-5133-4a6e-a729-c34fa0e7f6b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.059 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9656f2f4-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:39:55 np0005470441 kernel: tap9656f2f4-20: left promiscuous mode
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.080 2 INFO nova.compute.manager [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.080 2 DEBUG oslo.service.loopingcall [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.081 2 DEBUG nova.compute.manager [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.081 2 DEBUG nova.network.neutron [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.081 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc26e92-7293-4d58-ab19-94bf63fb276d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.104 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a7de8e77-b0eb-45bd-9492-432705ce7687]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.106 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[40ebcebe-f310-44ba-96f6-a5270eb2b704]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.124 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[105d8f98-cc4e-4164-8550-8b14a4ea7d69]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 423033, 'reachable_time': 19631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226303, 'error': None, 'target': 'ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.127 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9656f2f4-2cc7-490c-a6b2-3b5ee2efa72b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:39:55 np0005470441 systemd[1]: run-netns-ovnmeta\x2d9656f2f4\x2d2cc7\x2d490c\x2da6b2\x2d3b5ee2efa72b.mount: Deactivated successfully.
Oct  4 01:39:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:39:55.128 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[6793292a-e62b-47f6-886d-b807c4885bf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.605 2 DEBUG nova.compute.manager [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.607 2 DEBUG nova.compute.manager [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing instance network info cache due to event network-changed-d58c8be4-d665-45d0-b948-8ce2d9d5fee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.607 2 DEBUG oslo_concurrency.lockutils [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.607 2 DEBUG oslo_concurrency.lockutils [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.608 2 DEBUG nova.network.neutron [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Refreshing network info cache for port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.699 2 DEBUG nova.compute.manager [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-unplugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.700 2 DEBUG oslo_concurrency.lockutils [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.700 2 DEBUG oslo_concurrency.lockutils [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.700 2 DEBUG oslo_concurrency.lockutils [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.700 2 DEBUG nova.compute.manager [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] No waiting events found dispatching network-vif-unplugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:39:55 np0005470441 nova_compute[192626]: 2025-10-04 05:39:55.701 2 DEBUG nova.compute.manager [req-3c4d3ecd-f479-4e0f-b94b-e28f46937874 req-9b32983e-3561-4c62-9166-67a1987e5a26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-unplugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.163 2 DEBUG nova.network.neutron [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.204 2 INFO nova.compute.manager [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Took 1.12 seconds to deallocate network for instance.#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.225 2 INFO nova.network.neutron [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Port d58c8be4-d665-45d0-b948-8ce2d9d5fee9 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.226 2 DEBUG nova.network.neutron [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.259 2 DEBUG nova.network.neutron [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updated VIF entry in instance network info cache for port 1c32865e-e189-4f96-b7d5-f3c3a5136407. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.260 2 DEBUG nova.network.neutron [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updating instance_info_cache with network_info: [{"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.266 2 DEBUG oslo_concurrency.lockutils [req-dc797d35-ae03-43db-8ab3-528224b2fe03 req-a0663897-dc9a-4b27-b862-eb04dc4365be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-9e639d23-2cac-4fb4-a915-d88dfa03aad4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.274 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.274 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.298 2 DEBUG oslo_concurrency.lockutils [req-7591a97a-0760-4e81-b36a-d8bcb86baa29 req-87d02d2c-d15d-464b-9995-d6fb0bcf6310 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-369c5da6-9c6d-48e7-a402-88f996ed8276" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.586 2 DEBUG nova.compute.provider_tree [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.603 2 DEBUG nova.scheduler.client.report [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.624 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.658 2 INFO nova.scheduler.client.report [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Deleted allocations for instance 9e639d23-2cac-4fb4-a915-d88dfa03aad4#033[00m
Oct  4 01:39:56 np0005470441 nova_compute[192626]: 2025-10-04 05:39:56.735 2 DEBUG oslo_concurrency.lockutils [None req-be430cd8-51c6-45ce-b439-864b4de042dc 560c2ee221db4d87b04080584e8f0a48 2eaa5fc2c08b415c8c98103e044fc0a3 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.026s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.698 2 DEBUG nova.compute.manager [req-d4177d03-77c7-4349-9684-d40cce65af36 req-61a3b069-1d90-43a6-a95b-9dba77d30559 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-deleted-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.837 2 DEBUG nova.compute.manager [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.838 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.838 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.839 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9e639d23-2cac-4fb4-a915-d88dfa03aad4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.839 2 DEBUG nova.compute.manager [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] No waiting events found dispatching network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.839 2 WARNING nova.compute.manager [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Received unexpected event network-vif-plugged-d58c8be4-d665-45d0-b948-8ce2d9d5fee9 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.839 2 DEBUG nova.compute.manager [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.840 2 DEBUG nova.compute.manager [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing instance network info cache due to event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.840 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.840 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:39:57 np0005470441 nova_compute[192626]: 2025-10-04 05:39:57.841 2 DEBUG nova.network.neutron [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:40:00 np0005470441 nova_compute[192626]: 2025-10-04 05:40:00.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:00 np0005470441 nova_compute[192626]: 2025-10-04 05:40:00.278 2 DEBUG nova.network.neutron [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updated VIF entry in instance network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:40:00 np0005470441 nova_compute[192626]: 2025-10-04 05:40:00.279 2 DEBUG nova.network.neutron [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:00 np0005470441 nova_compute[192626]: 2025-10-04 05:40:00.303 2 DEBUG oslo_concurrency.lockutils [req-fa00d9b2-e584-48a7-b2cf-e48ec4ec8a5f req-d89d4f28-71db-4998-94d0-3fa6d4230b1a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:40:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:00Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:40:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:00Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:40:01 np0005470441 podman[226329]: 2025-10-04 05:40:01.31215096 +0000 UTC m=+0.055706594 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Oct  4 01:40:01 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:01Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:44:b1:43 10.100.0.8
Oct  4 01:40:01 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:01Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:44:b1:43 10.100.0.8
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.747 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.827 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.893 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.895 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.970 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:01 np0005470441 nova_compute[192626]: 2025-10-04 05:40:01.976 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.047 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.049 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.108 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.257 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.258 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5377MB free_disk=73.40815353393555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.259 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.259 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.342 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.343 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 369c5da6-9c6d-48e7-a402-88f996ed8276 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.343 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.344 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.408 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:40:02 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:02Z|00214|binding|INFO|Releasing lport ac619930-3558-40f2-b142-298dd722addb from this chassis (sb_readonly=0)
Oct  4 01:40:02 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:02Z|00215|binding|INFO|Releasing lport 97bc4ce0-5f5c-4023-9415-069692deb3ec from this chassis (sb_readonly=0)
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.427 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.452 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.453 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:02 np0005470441 nova_compute[192626]: 2025-10-04 05:40:02.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.711 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'name': 'tempest-TestNetworkBasicOps-server-971816448', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001d', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ec39d6d697445438e79b0bfc666a027', 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'hostId': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.713 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001c', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd0c087ea0f62444e80490916b42c760f', 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'hostId': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.714 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.731 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.requests volume: 1070 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.732 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.750 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.requests volume: 1131 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.750 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89dddda6-b16b-496a-a639-a4b6cb371bba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1070, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.714124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92f9e606-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '9fd91405fe82a5f99b9d0d13f19d88ac443868a6d20c5aed078290ea28f15a81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.714124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92f9f402-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '762212ef900b5c602008fabf3a03ad20082f566fe69528d3bfec3e2032a97b38'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1131, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.714124', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92fcacb0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '0d6d5099d878e28bdd73e252be0155ff2499d3e2624fb823ef698f63813049ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.714124', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92fcb700-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': 'f7a352b60b1ffee3e6e71eefea68c81a336b6483899e995fd0ae31ea801e27a3'}]}, 'timestamp': '2025-10-04 05:40:02.750969', '_unique_id': '3ac7e35065e64165bc6570f34560fd99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.752 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>]
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.latency volume: 1894514398 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.753 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.latency volume: 3648046745 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9ba9ed9-6c99-4392-9644-6ff00fb3de00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1894514398, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.753426', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92fd2136-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '2acbb963973b60d66a01d95be44f1595557be625bd135f8f7317c54725d886b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.753426', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92fd29b0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '53d1d9e7f977c8583fdac0d3edc504047f258d45a84ab555242538da62e82531'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3648046745, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.753426', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92fd3158-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '69b8544a4be4504c6ebaf0ea6c5c146d11a0800598d91fb7bba4573871e9daaf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.753426', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92fd38ba-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': 'd01e676de0caac12a38b0d71c1cfa2fa411c29fb8ae4cd5cc23dd0cc7d77ff9e'}]}, 'timestamp': '2025-10-04 05:40:02.754282', '_unique_id': '41ed7c8a742b49da94303ca78e708dcc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.754 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.755 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.755 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.755 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92ac9d5-dc5e-4665-98b4-a4a12b4f9ae1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.755474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92fd7140-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': 'f6cb4b40b427a088e85176658b895aafd7ff767fe5b4caa7193b96d67b878fdb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.755474', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92fd78f2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '1861623b392d64868128c9bae9c4f113b1cdab83d839754c112b0c62c8c91371'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.755474', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '92fd80cc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '33a2e65bebfe7295a5d64064b3c117bb5c89d81ca30df616cf759b12a28f6430'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.755474', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '92fd882e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': 'bf21c7d212bfc00259f43b9702b37898488388fa1a7f6a654bd8e8a5d4f6ce0b'}]}, 'timestamp': '2025-10-04 05:40:02.756326', '_unique_id': '4d34370caed24a0099bafb3215f229c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.756 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.757 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.774 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.774 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.783 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.783 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '287d7aec-8850-4cea-8527-b3f9d482a075', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.757648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93004ffa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': 'd22349f45495daaf05b6087dc4f4ef5354b1fea2d206f4235b84046bcca25e1f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.757648', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93005b76-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': 'c9fe5f3dff37289a0d36a2e5ad9a162672b7f5a4dd9f47a972a47d07440dde9a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.757648', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '9301b084-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': '89efb7d30ef0867bc74b31f604d7a01d63ec9b832f22f242d68d7aeb383a7460'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.757648', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9301b99e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': '5ea51191093ffe26fff5d6a56bb82b7ef10b852828ff25ab6a7f5c41b3c23e23'}]}, 'timestamp': '2025-10-04 05:40:02.783787', '_unique_id': '4c0ed637f14449f0b36e3630cc0e3e7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.784 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.787 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 369c5da6-9c6d-48e7-a402-88f996ed8276 / tap1c32865e-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.788 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.790 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 / tapea5d5ce9-b8 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.790 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '712bd5d2-c008-40e2-ba99-c955893e02db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.785437', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930267f4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': 'c68004f6e04aec8dc105565fc533a23ade22572e338fba5451b423207ac56b37'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.785437', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '9302d180-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': 'a87d46a8a88aee7f0e42dad5947616d3936566cbd3fed595f594b6cdd0821f19'}]}, 'timestamp': '2025-10-04 05:40:02.791027', '_unique_id': '8a09e2b4cf8348349a22ce1b68ac8cf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 ERROR oslo_messaging.notify.messaging 
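[editor's note] The innermost frame above, `self.sock.connect(sa)` in amqp/transport.py, is a plain TCP connect: when nothing is listening on the broker's port (RabbitMQ's default is 5672), the kernel returns ECONNREFUSED (errno 111), which Python surfaces as `ConnectionRefusedError`. A minimal stdlib-only sketch of that probe, useful for confirming the broker is down independently of ceilometer (`probe_broker` is a hypothetical helper, not part of any of the libraries in the traceback):

```python
import socket

def probe_broker(host, port, timeout=1.0):
    """Attempt a TCP connect; return None on success, or the errno on failure.

    This mirrors what amqp/transport.py's _connect() does before any AMQP
    handshake: a refused connect here means no broker process is listening.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(timeout)
    try:
        sock.connect((host, port))
        return None
    except OSError as exc:
        # ConnectionRefusedError is a subclass of OSError; errno 111 on Linux.
        return exc.errno
    finally:
        sock.close()
```

Running `probe_broker("controller", 5672)` against the host from the transport URL and getting errno 111 confirms the failure is at the TCP layer (broker stopped or firewalled), not an AMQP authentication or vhost problem.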
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.792 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.793 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.latency volume: 612857198 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.793 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.latency volume: 77962763 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.793 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.latency volume: 592659637 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.793 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.latency volume: 49869981 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cc8db86-bab4-4f37-af68-e6aba4794171', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 612857198, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.793054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93032cd4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': 'b37c93a37ea1669b85c604fda549f177f2fb917fdc20f607d14e5826265645e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 77962763, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.793054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93033530-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '993ca7686786fbbe27566629d9425caa6a2a12c5c0d4ac0c00812e2f16b85039'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 592659637, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.793054', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93033f1c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '2c11fdeec93fcd03d76e42585b918a70d99a763787168fab0c3806ffd01e8cda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 49869981, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.793054', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '9303470a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '743d4cd55115e117660fbe25f6d06da1a14908a92a77839fe40c5fdd60346714'}]}, 'timestamp': '2025-10-04 05:40:02.793953', '_unique_id': 'b1d56af9435e48bba5c7a136d22ab511'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
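[editor's note] The "Could not send notification" record above logs the entire dropped payload: a dict whose `payload.samples` list holds one measurement per resource/counter pair. When the broker is down these log lines are the only trace of the lost samples, so a small triage helper that condenses each sample to its identifying fields can be handy (`summarize_samples` is a hypothetical illustration, not a ceilometer API):

```python
def summarize_samples(notification):
    """Yield (resource_id, counter_name, counter_volume, counter_unit)
    for each sample in an oslo.messaging notification payload dict,
    matching the structure logged by ceilometer above."""
    for sample in notification.get("payload", {}).get("samples", []):
        yield (
            sample["resource_id"],
            sample["counter_name"],
            sample["counter_volume"],
            sample["counter_unit"],
        )
```

Applied to the payload above, this would reduce each dropped notification to lines like `('369c5da6-...-vda', 'disk.device.read.latency', 612857198, 'ns')`, which is usually enough to judge whether the lost telemetry matters.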
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.794 12 ERROR oslo_messaging.notify.messaging 
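[editor's note] Each error above prints two tracebacks joined by "The above exception was the direct cause of the following exception": kombu's `_reraise_as_library_errors` catches the low-level `ConnectionRefusedError` and re-raises it as `kombu.exceptions.OperationalError` via `raise ... from exc`, so callers of oslo.messaging only ever see kombu's library error type while the socket-level cause is preserved as `__cause__`. A stdlib-only sketch of that pattern (the `OperationalError` class here is a stand-in for kombu's, and `ensure_connection` is a simplified illustration of its retry wrapper):

```python
class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

def ensure_connection(connect):
    """Call connect(); translate socket-level failures into the library's
    error type while keeping the original exception chained as __cause__."""
    try:
        return connect()
    except OSError as exc:
        # Explicit chaining: Python prints the original traceback first,
        # then "The above exception was the direct cause of ...".
        raise OperationalError(str(exc)) from exc
```

This is why retry logic around oslo.messaging should catch `kombu.exceptions.OperationalError` (or oslo's own wrappers) rather than `ConnectionRefusedError` directly.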
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.795 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.809 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/cpu volume: 11350000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.823 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/cpu volume: 11720000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6692f2f-d1ef-444a-a53f-6558c9c3a537', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11350000000, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'timestamp': '2025-10-04T05:40:02.795157', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9305c49e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.52431397, 'message_signature': '5a807ee94d1bb2e5283c37ed80fc77065f7046d08873e3cb99704bd64a41235e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11720000000, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 
'1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'timestamp': '2025-10-04T05:40:02.795157', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '9307d752-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.537958938, 'message_signature': '26eb63a8aff1e0d4d5c7f28865e9f6953069c87c9bdedaf7a1c5c43bb7782c86'}]}, 'timestamp': '2025-10-04 05:40:02.823944', '_unique_id': '4501139055da44b7822e06a3dac49cdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.825 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.826 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.826 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.826 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.826 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.827 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '743bb883-bbb0-4d52-adaf-13b59511e752', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.826124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93084188-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': '4da6c1a0c72b64d67d839f1607d8c911a0058165128a41656ea0d5adeb2968e4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.826124', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93084e76-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': 'a3ed82c598d20669b13159f73c334e88e1a63712525ef68102ebcb4271f2c592'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.826124', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93085998-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': 'fe2e4d65f35b12cff65c976ef2f114bd2f9a811335b172079b645432d297a006'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.826124', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93086c8a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': '881094ed9227513bc5ef38df7d392cc603011d1b15402f085f5261d3b2e37b01'}]}, 'timestamp': '2025-10-04 05:40:02.827723', '_unique_id': 'ea1f61e3f6054045991c95d6a07b9412'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.828 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.829 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.829 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06f7c5ab-f997-4cf9-acb1-73ae65508462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.829405', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '9308bb86-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': 'eb2e5ee41f9d718eb8506638283c4226c3d2e17bfcde5f7cfa625561bfa7322d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.829405', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '9308c838-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': '3b4926ea2cc9848e1e4ef47490f615ddd4336788111e57c1c8e4617636510aaf'}]}, 'timestamp': '2025-10-04 05:40:02.830083', '_unique_id': '4c51488b5d664dffa99b8597f3f8dce1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.830 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.831 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.832 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72ea09b4-472e-4c37-ba7d-f75f6112fbf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.831708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '93091414-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '5948c3efda3cebd0fffa1e288bb7397ce42741f1f82a60f0de039317b3dcdf36'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.831708', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930922c4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': 'a9472045177c8cc76da99d615a96c1d1c0e378ebcaa0ec1eb1bc0f64ac5bfc2f'}]}, 'timestamp': '2025-10-04 05:40:02.832412', '_unique_id': 'b1e3a0f90f8545f9a19a609440083706'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.833 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.834 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.bytes volume: 29616640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.834 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.834 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.bytes volume: 31001088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.835 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '269b9cc6-d2f5-4dac-9489-dfa7bbb81b18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29616640, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.834209', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930975ee-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '5250e1cbfb495a598b2764362f28ff1b24be38b89195bceca7b21f0b2eb0f364'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.834209', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93098322-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': 'e09e14185f40a5db19154e53500a56edc01bb1c4c45d9f6d13191ee211dfdec6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31001088, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.834209', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '93099114-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '5e23520db31ce82dcd18f43cf1f23a43492aec3ab00b27078d0834b3e9283cca'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.834209', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '93099c2c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '1e323d6f617b8974aaf00413bbed8360d121349a49b72fe6714c24a59f764de1'}]}, 'timestamp': '2025-10-04 05:40:02.835496', '_unique_id': '18b75af094f64c2d92ae5962ec0bb4ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.837 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.837 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>]
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.837 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.837 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.838 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d9d1694-ff71-4a15-9426-59bc840329a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.837815', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930a034c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '743b9fa55e895e3d41c6f9c13c456136d244633f457bd044d36399512b86af71'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.837815', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930a10a8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': 'aae7958e9bfddd375d4e4a16eb1d6f9b0698fc950d70fdcbe6f3eeeba858dbf2'}]}, 'timestamp': '2025-10-04 05:40:02.838540', '_unique_id': 'b38d7c13c8be478fbae433d9986fede7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.839 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.840 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.840 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>]
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.840 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.841 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15f94fec-7e07-4b70-8318-7dbe8d10ed4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.840792', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930a77dc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '911370bf09617eb08017dc74af78802c964bf48757e70bde13f7cf879983d245'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.840792', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930a83f8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': 'dc3dfab4d152a7b8a3d6262180447bbad70899b43353b2c8c453cfc1f2915a0e'}]}, 'timestamp': '2025-10-04 05:40:02.841441', '_unique_id': '1ff0bcd977c9487880c1bf169dd9d380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.843 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.843 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.843 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b68054e-800e-4300-a48b-15397db71918', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.843213', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930ad5d8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '3fb37961ac5e85ee1075f2386b0977de9dd3ea3598048fcb4ac64673452dfcee'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.843213', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930ae3a2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': '2c2b7ca338cf53f43b8013bf047ac66e912a948cdd3af9e22192b6b5e71ebeab'}]}, 'timestamp': '2025-10-04 05:40:02.843889', '_unique_id': '756c2ba540f1440bbce39571cc049d89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.844 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.845 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.845 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.846 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/memory.usage volume: 40.375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10862662-e404-480c-831f-5e1c5b874899', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'timestamp': '2025-10-04T05:40:02.845778', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '930b39d8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.52431397, 'message_signature': 'f96dbfed16d751bf2448da3628b410d8a62ac54bc0e489535788f96ac4e7657f'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.375, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 
'timestamp': '2025-10-04T05:40:02.845778', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '930b4554-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.537958938, 'message_signature': '5b0a1fb71ee121a74c91d95805bfe946cf0a290c8234f8b73053c3da4476e99b'}]}, 'timestamp': '2025-10-04 05:40:02.846380', '_unique_id': '5c2bc6caa12e4394873050e9f647fd10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.847 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.848 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.848 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.requests volume: 303 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.848 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.849 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.requests volume: 324 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.849 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '101b9a7d-89f9-4506-8350-b2f0af2ebf79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 303, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.848311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930b9ca2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '3b4b90e5d57e6a7c843905e81dd5b298b92778e85117107a16f9b13446c158f5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.848311', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930baa58-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.428671372, 'message_signature': '9b8da55d1626e982b936174390edf0ae20258cf0d66326e9cd06cbdbe4e63fc4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 324, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.848311', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930bb7f0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '55045ba745a76d663e9a3d6bd940c055d8310a11216b6f9ca8e87250873ff5a2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.848311', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930bc2cc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.447404204, 'message_signature': '91c3c87beafcb94d0547cbf829c190bf7d011d81331afaa0c3601e62b0516a8e'}]}, 'timestamp': '2025-10-04 05:40:02.849617', '_unique_id': 'a8a2c44a4c6c4d1eaa4621e2a9789e64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.851 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.851 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.851 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.851 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.852 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bf608df-6b99-4fd4-8650-99fb34356be2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '369c5da6-9c6d-48e7-a402-88f996ed8276-vda', 'timestamp': '2025-10-04T05:40:02.851304', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930c1452-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': 'a7706e5147269b4bcc25b204eb0f22084c83ca4e1edd998e62fee71388ec3421'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'369c5da6-9c6d-48e7-a402-88f996ed8276-sda', 'timestamp': '2025-10-04T05:40:02.851304', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'instance-0000001d', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930c201e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.472206819, 'message_signature': 'cb1875b06f908454ce2e370e7351b307fb68c9338d721d8f5ce06f53f0c4b020'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-vda', 'timestamp': '2025-10-04T05:40:02.851304', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '930c2ac8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': '9fe9ad8669bd3d4a820bc6056fbcc50c0fb3e9b0f7df000f0d08ea0de7556382'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-sda', 'timestamp': '2025-10-04T05:40:02.851304', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'instance-0000001c', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '930c3554-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.489352367, 'message_signature': 'da2af49c77cf2d4f65c8bd9a5bbd9e2c1a7d122d5d745cf9ecede8bf1c0eba15'}]}, 'timestamp': '2025-10-04 05:40:02.852602', '_unique_id': 'd7e83ce93dc24f38b51b2124718e9358'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.853 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.854 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.854 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.854 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8f58e41-e2ef-4261-899c-ad5f9ceb41ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.854445', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930c90b2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': 'b02266b494951712f96e04c2a51fb5c19b8793ccd67065774718f10cdf1ea7e3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.854445', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930ca0c0-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': '6ae84d6a1bbe795653875fe362a8b4ac449956799b5a685c84323a97f40b3960'}]}, 'timestamp': '2025-10-04 05:40:02.855292', '_unique_id': '3499c6693a4344188b51d5b180e4dbee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.856 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.857 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.857 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.857 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-971816448>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-506251172>]
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.857 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.858 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe51a933-1c28-42a6-a564-8d8c926c05f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.857689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930d0d12-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': 'bab5350a8c27404e6ec950684921b0fcbddc6e138af895bde83951d612e79a7e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.857689', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930d1a14-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': 'c077a7f3707810be842397561c65a01ea077a00511af683b54178a5077d38d3b'}]}, 'timestamp': '2025-10-04 05:40:02.858420', '_unique_id': '93fdb1112cc94ad0b417d554bfd66435'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.859 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.860 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.860 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.outgoing.bytes volume: 1242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.860 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.outgoing.bytes volume: 1284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d71e0e8-d249-49c1-8e9b-9bb49e5f21ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1242, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.860255', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930d6f3c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '515b266e7d553b2dfe7337b99b82040054244189d81a9162299b243431affc91'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1284, 'user_id': 
'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.860255', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930d7e50-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': '60d40344d6d9f5a1ed926e9adaf2416d82898a2d3bbd573137e908812dad6a9b'}]}, 'timestamp': '2025-10-04 05:40:02.861010', '_unique_id': 'b2f1989001744cc684a5a864b66c0983'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.861 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.862 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.862 12 DEBUG ceilometer.compute.pollsters [-] 369c5da6-9c6d-48e7-a402-88f996ed8276/network.incoming.bytes volume: 2044 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.863 12 DEBUG ceilometer.compute.pollsters [-] 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/network.incoming.bytes volume: 1848 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43718fcc-3cc4-40f0-b634-880b485804e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2044, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001d-369c5da6-9c6d-48e7-a402-88f996ed8276-tap1c32865e-e1', 'timestamp': '2025-10-04T05:40:02.862688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-971816448', 'name': 'tap1c32865e-e1', 'instance_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:44:b1:43', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1c32865e-e1'}, 'message_id': '930dce46-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.499980129, 'message_signature': '56ea5f46ae831be3d7defd8134166156f6ed90ec6d29f5438a6589d1a44a88ca'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1848, 'user_id': 
'd65c768451494a3f9e4f9a238fa5c40d', 'user_name': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_name': None, 'resource_id': 'instance-0000001c-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-tapea5d5ce9-b8', 'timestamp': '2025-10-04T05:40:02.862688', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-506251172', 'name': 'tapea5d5ce9-b8', 'instance_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'instance_type': 'm1.nano', 'host': '16c72042d5b0461e4626824461dca7600d70b95b92edf7a636392f6c', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:94:6c:37', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapea5d5ce9-b8'}, 'message_id': '930ddb5c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4300.502782479, 'message_signature': '6cb44a8fa12d97b6b0b36972d8b26185bb78f3690c6c97997ea32cdb06ccac30'}]}, 'timestamp': '2025-10-04 05:40:02.863358', '_unique_id': 'ce822dc13c684b47b72c3a552de00d56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:40:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:40:02.864 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:40:03 np0005470441 nova_compute[192626]: 2025-10-04 05:40:03.455 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:03 np0005470441 nova_compute[192626]: 2025-10-04 05:40:03.455 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:40:03 np0005470441 nova_compute[192626]: 2025-10-04 05:40:03.455 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:40:04 np0005470441 nova_compute[192626]: 2025-10-04 05:40:04.062 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:40:04 np0005470441 nova_compute[192626]: 2025-10-04 05:40:04.063 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:40:04 np0005470441 nova_compute[192626]: 2025-10-04 05:40:04.063 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:40:04 np0005470441 nova_compute[192626]: 2025-10-04 05:40:04.063 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:05 np0005470441 nova_compute[192626]: 2025-10-04 05:40:05.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:05 np0005470441 podman[226364]: 2025-10-04 05:40:05.325262209 +0000 UTC m=+0.077501314 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:40:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:06.747 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:06.747 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:06.747 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.063 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.080 2 INFO nova.compute.manager [None req-9a1d04a0-9b55-4677-ac30-b07a5dc57f48 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Get console output#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.082 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.082 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.083 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.084 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.084 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.084 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.085 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.085 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.089 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.115 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.115 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.115 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 01:40:07 np0005470441 podman[226388]: 2025-10-04 05:40:07.317449153 +0000 UTC m=+0.075375323 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:07 np0005470441 nova_compute[192626]: 2025-10-04 05:40:07.762 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.197 2 INFO nova.compute.manager [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Rebuilding instance#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.246 2 INFO nova.compute.manager [None req-fde932bf-9fdc-44f7-bd03-43c2928fce22 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Get console output#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.252 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.444 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.463 2 DEBUG nova.compute.manager [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.507 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_requests' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.526 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.542 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'resources' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.554 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'migration_context' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.574 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.577 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.577 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.577 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.578 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.578 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.579 2 INFO nova.compute.manager [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Terminating instance#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.580 2 DEBUG nova.compute.manager [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.581 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  4 01:40:08 np0005470441 kernel: tap1c32865e-e1 (unregistering): left promiscuous mode
Oct  4 01:40:08 np0005470441 NetworkManager[51690]: <info>  [1759556408.6072] device (tap1c32865e-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:08Z|00216|binding|INFO|Releasing lport 1c32865e-e189-4f96-b7d5-f3c3a5136407 from this chassis (sb_readonly=0)
Oct  4 01:40:08 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:08Z|00217|binding|INFO|Setting lport 1c32865e-e189-4f96-b7d5-f3c3a5136407 down in Southbound
Oct  4 01:40:08 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:08Z|00218|binding|INFO|Removing iface tap1c32865e-e1 ovn-installed in OVS
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.632 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:b1:43 10.100.0.8'], port_security=['fa:16:3e:44:b1:43 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '369c5da6-9c6d-48e7-a402-88f996ed8276', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '544dc3d9-8b22-4833-9188-b1166a076883', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.215'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=81506072-2e1b-4219-b643-c8187215261f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=1c32865e-e189-4f96-b7d5-f3c3a5136407) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.633 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 1c32865e-e189-4f96-b7d5-f3c3a5136407 in datapath 04999c96-51a9-44fc-b4c8-a6213c9bc268 unbound from our chassis#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.633 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 04999c96-51a9-44fc-b4c8-a6213c9bc268, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.634 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[66cdd673-7d58-497b-b1dd-238084408f39]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.635 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268 namespace which is not needed anymore#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Oct  4 01:40:08 np0005470441 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001d.scope: Consumed 13.068s CPU time.
Oct  4 01:40:08 np0005470441 systemd-machined[152624]: Machine qemu-14-instance-0000001d terminated.
Oct  4 01:40:08 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [NOTICE]   (226087) : haproxy version is 2.8.14-c23fe91
Oct  4 01:40:08 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [NOTICE]   (226087) : path to executable is /usr/sbin/haproxy
Oct  4 01:40:08 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [WARNING]  (226087) : Exiting Master process...
Oct  4 01:40:08 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [ALERT]    (226087) : Current worker (226097) exited with code 143 (Terminated)
Oct  4 01:40:08 np0005470441 neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268[226060]: [WARNING]  (226087) : All workers exited. Exiting... (0)
Oct  4 01:40:08 np0005470441 systemd[1]: libpod-afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832.scope: Deactivated successfully.
Oct  4 01:40:08 np0005470441 podman[226430]: 2025-10-04 05:40:08.766700416 +0000 UTC m=+0.047959904 container died afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:40:08 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832-userdata-shm.mount: Deactivated successfully.
Oct  4 01:40:08 np0005470441 systemd[1]: var-lib-containers-storage-overlay-17752b182b67c71958fc326891b03996b72c2689dcd3477be31e7a9b81d22e8e-merged.mount: Deactivated successfully.
Oct  4 01:40:08 np0005470441 podman[226430]: 2025-10-04 05:40:08.817744227 +0000 UTC m=+0.099003705 container cleanup afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  4 01:40:08 np0005470441 systemd[1]: libpod-conmon-afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832.scope: Deactivated successfully.
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.848 2 INFO nova.virt.libvirt.driver [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Instance destroyed successfully.#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.849 2 DEBUG nova.objects.instance [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 369c5da6-9c6d-48e7-a402-88f996ed8276 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.867 2 DEBUG nova.virt.libvirt.vif [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-971816448',display_name='tempest-TestNetworkBasicOps-server-971816448',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-971816448',id=29,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHibsOavvT5Q22UxMUeFHlMlfvZtwgbPzWkMoHbvilBwdbM4rdSMfjdqdv+XQ5xhGsZ0gFbusTh5D97rwOcKuML1QRTuZJX9N8yVKv1zSUJMFdE0q9S5hSidpnOFsu8Dfg==',key_name='tempest-TestNetworkBasicOps-778878579',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:39:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-1vg8r9z1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:39:49Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=369c5da6-9c6d-48e7-a402-88f996ed8276,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.867 2 DEBUG nova.network.os_vif_util [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "address": "fa:16:3e:44:b1:43", "network": {"id": "04999c96-51a9-44fc-b4c8-a6213c9bc268", "bridge": "br-int", "label": "tempest-network-smoke--1882843391", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1c32865e-e1", "ovs_interfaceid": "1c32865e-e189-4f96-b7d5-f3c3a5136407", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.868 2 DEBUG nova.network.os_vif_util [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.868 2 DEBUG os_vif [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.870 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1c32865e-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.876 2 INFO os_vif [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:44:b1:43,bridge_name='br-int',has_traffic_filtering=True,id=1c32865e-e189-4f96-b7d5-f3c3a5136407,network=Network(04999c96-51a9-44fc-b4c8-a6213c9bc268),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1c32865e-e1')#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.876 2 INFO nova.virt.libvirt.driver [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Deleting instance files /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276_del#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.877 2 INFO nova.virt.libvirt.driver [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Deletion of /var/lib/nova/instances/369c5da6-9c6d-48e7-a402-88f996ed8276_del complete#033[00m
Oct  4 01:40:08 np0005470441 podman[226472]: 2025-10-04 05:40:08.89243402 +0000 UTC m=+0.047773639 container remove afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.898 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9e9b15-3ff8-437f-a67f-c620b01efd55]: (4, ('Sat Oct  4 05:40:08 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268 (afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832)\nafc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832\nSat Oct  4 05:40:08 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268 (afc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832)\nafc624362d6134e9caccb422fb2010ba69f7941d785fdb752dcc704b1abba832\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.900 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1a5d37-f69f-4722-88ec-995682da23c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.901 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04999c96-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 kernel: tap04999c96-50: left promiscuous mode
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.916 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f19994aa-d117-49e0-8b1c-d440cd4af22c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.934 2 INFO nova.compute.manager [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.935 2 DEBUG oslo.service.loopingcall [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.936 2 DEBUG nova.compute.manager [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:40:08 np0005470441 nova_compute[192626]: 2025-10-04 05:40:08.936 2 DEBUG nova.network.neutron [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.942 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fd406650-b2f9-4df3-bfef-239380e14a42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.944 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e675f264-4a8a-4c36-a57c-e913a33b94ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.958 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b59234d2-bd85-45c0-9ea8-0c78acbd0971]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428568, 'reachable_time': 27131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226490, 'error': None, 'target': 'ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.960 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-04999c96-51a9-44fc-b4c8-a6213c9bc268 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:40:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:08.960 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[e70be0f5-956b-42fb-86f7-e5e7f63ef2ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:08 np0005470441 systemd[1]: run-netns-ovnmeta\x2d04999c96\x2d51a9\x2d44fc\x2db4c8\x2da6213c9bc268.mount: Deactivated successfully.
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.021 2 DEBUG nova.compute.manager [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-unplugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.022 2 DEBUG oslo_concurrency.lockutils [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.022 2 DEBUG oslo_concurrency.lockutils [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.023 2 DEBUG oslo_concurrency.lockutils [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.023 2 DEBUG nova.compute.manager [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] No waiting events found dispatching network-vif-unplugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.024 2 DEBUG nova.compute.manager [req-6937bfdf-7d0c-448d-9eef-57479f4a0b3a req-73b8b71a-128e-422c-bf18-5b5a1f7e5d1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-unplugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.947 2 DEBUG nova.network.neutron [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.980 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556394.9790401, 9e639d23-2cac-4fb4-a915-d88dfa03aad4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.981 2 INFO nova.compute.manager [-] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:40:09 np0005470441 nova_compute[192626]: 2025-10-04 05:40:09.982 2 INFO nova.compute.manager [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Took 1.05 seconds to deallocate network for instance.#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.004 2 DEBUG nova.compute.manager [None req-c0363d1c-0ff3-4fb7-a6e6-de3c2133b572 - - - - - -] [instance: 9e639d23-2cac-4fb4-a915-d88dfa03aad4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.040 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.041 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.134 2 DEBUG nova.compute.provider_tree [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.153 2 DEBUG nova.scheduler.client.report [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.180 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.231 2 DEBUG nova.compute.manager [req-b60f583a-11fa-4516-82ae-850496c6261d req-bf15a09a-1385-4389-ae3b-93a161f8e093 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-deleted-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.236 2 INFO nova.scheduler.client.report [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 369c5da6-9c6d-48e7-a402-88f996ed8276#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.318 2 DEBUG oslo_concurrency.lockutils [None req-54948d5a-d4fb-489a-8bf7-b64aa3fe84fa b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:10 np0005470441 kernel: tapea5d5ce9-b8 (unregistering): left promiscuous mode
Oct  4 01:40:10 np0005470441 NetworkManager[51690]: <info>  [1759556410.7739] device (tapea5d5ce9-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:10Z|00219|binding|INFO|Releasing lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 from this chassis (sb_readonly=0)
Oct  4 01:40:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:10Z|00220|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 down in Southbound
Oct  4 01:40:10 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:10Z|00221|binding|INFO|Removing iface tapea5d5ce9-b8 ovn-installed in OVS
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:10.806 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:6c:37 10.100.0.8'], port_security=['fa:16:3e:94:6c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1116f56-9520-48d8-8bb2-2519f97b3338', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '48042eb9-5c9a-49d3-9ddf-88f8ff74c14b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67ce5926-63fc-4d20-a2ee-8b5c0eb6e716, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:10.807 103689 INFO neutron.agent.ovn.metadata.agent [-] Port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 in datapath f1116f56-9520-48d8-8bb2-2519f97b3338 unbound from our chassis#033[00m
Oct  4 01:40:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:10.809 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1116f56-9520-48d8-8bb2-2519f97b3338, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:40:10 np0005470441 nova_compute[192626]: 2025-10-04 05:40:10.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:10.810 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb8c5ef-eef0-4354-967f-046467408bda]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:10.812 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 namespace which is not needed anymore#033[00m
Oct  4 01:40:10 np0005470441 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct  4 01:40:10 np0005470441 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001c.scope: Consumed 13.619s CPU time.
Oct  4 01:40:10 np0005470441 systemd-machined[152624]: Machine qemu-15-instance-0000001c terminated.
Oct  4 01:40:10 np0005470441 podman[226491]: 2025-10-04 05:40:10.885548892 +0000 UTC m=+0.089484705 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:40:10 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [NOTICE]   (226170) : haproxy version is 2.8.14-c23fe91
Oct  4 01:40:10 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [NOTICE]   (226170) : path to executable is /usr/sbin/haproxy
Oct  4 01:40:10 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [WARNING]  (226170) : Exiting Master process...
Oct  4 01:40:10 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [ALERT]    (226170) : Current worker (226172) exited with code 143 (Terminated)
Oct  4 01:40:10 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226166]: [WARNING]  (226170) : All workers exited. Exiting... (0)
Oct  4 01:40:10 np0005470441 systemd[1]: libpod-4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07.scope: Deactivated successfully.
Oct  4 01:40:10 np0005470441 podman[226539]: 2025-10-04 05:40:10.958157376 +0000 UTC m=+0.055362535 container died 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:40:10 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07-userdata-shm.mount: Deactivated successfully.
Oct  4 01:40:10 np0005470441 systemd[1]: var-lib-containers-storage-overlay-92e9c9aa52db566a69381c205bfff2ce7d8a9da7e1dd113e8de7e02800f7ca0e-merged.mount: Deactivated successfully.
Oct  4 01:40:10 np0005470441 podman[226539]: 2025-10-04 05:40:10.991429271 +0000 UTC m=+0.088634430 container cleanup 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:40:11 np0005470441 systemd[1]: libpod-conmon-4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07.scope: Deactivated successfully.
Oct  4 01:40:11 np0005470441 NetworkManager[51690]: <info>  [1759556411.0077] manager: (tapea5d5ce9-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 podman[226572]: 2025-10-04 05:40:11.058545639 +0000 UTC m=+0.043645642 container remove 4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.064 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6657ee-eca7-4375-8c71-0022503e3c26]: (4, ('Sat Oct  4 05:40:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 (4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07)\n4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07\nSat Oct  4 05:40:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 (4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07)\n4aa33a9d536aa61192e2804ac62c9c9b82abc446c764f72c702c7848522f1e07\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.065 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7503706b-996b-433a-a665-ab059552d36f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.066 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1116f56-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 kernel: tapf1116f56-90: left promiscuous mode
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.084 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[27637b29-4016-4236-89db-ec3814118b19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.115 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ae49d7a2-da33-4096-b471-327e675f54d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.116 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1ef31b-e109-464b-9471-ea96a31f531a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.128 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd65261-07fb-4424-908f-706836b5aa9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 428654, 'reachable_time': 23991, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226603, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.131 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:40:11 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:11.131 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[8512dd33-5dc2-4bc8-ae0e-f12e87e9cdcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:11 np0005470441 systemd[1]: run-netns-ovnmeta\x2df1116f56\x2d9520\x2d48d8\x2d8bb2\x2d2519f97b3338.mount: Deactivated successfully.
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.221 2 DEBUG nova.compute.manager [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.222 2 DEBUG oslo_concurrency.lockutils [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.222 2 DEBUG oslo_concurrency.lockutils [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.222 2 DEBUG oslo_concurrency.lockutils [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "369c5da6-9c6d-48e7-a402-88f996ed8276-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.223 2 DEBUG nova.compute.manager [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] No waiting events found dispatching network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.223 2 WARNING nova.compute.manager [req-5832683b-089a-4dc9-8ab5-776c3e4f78cd req-1ba8b82a-4f4c-46c8-a4b5-061a19e586be 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Received unexpected event network-vif-plugged-1c32865e-e189-4f96-b7d5-f3c3a5136407 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.605 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance shutdown successfully after 3 seconds.#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.610 2 INFO nova.virt.libvirt.driver [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance destroyed successfully.#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.613 2 INFO nova.virt.libvirt.driver [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance destroyed successfully.#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.614 2 DEBUG nova.virt.libvirt.vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:39:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:40:07Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 
4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.614 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.615 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.615 2 DEBUG os_vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.617 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea5d5ce9-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.622 2 INFO os_vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8')#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.623 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Deleting instance files /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016_del#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.624 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Deletion of /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016_del complete#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.819 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.819 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating image(s)#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.821 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.821 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.823 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.823 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "e27831ca76d76d559705407a4522553250354951" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:11 np0005470441 nova_compute[192626]: 2025-10-04 05:40:11.824 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "e27831ca76d76d559705407a4522553250354951" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:12 np0005470441 nova_compute[192626]: 2025-10-04 05:40:12.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.031 2 DEBUG nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-unplugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.031 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.031 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.031 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.032 2 DEBUG nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] No waiting events found dispatching network-vif-unplugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.032 2 WARNING nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received unexpected event network-vif-unplugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.032 2 DEBUG nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.032 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.032 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.033 2 DEBUG oslo_concurrency.lockutils [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.033 2 DEBUG nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] No waiting events found dispatching network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.033 2 WARNING nova.compute.manager [req-eba91099-f027-4545-980f-f63cdb11dcd7 req-faf09153-b565-4734-8dc4-e6a2cc854fe7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received unexpected event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
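The two WARNING lines above show Neutron's `network-vif-unplugged`/`network-vif-plugged` events arriving while no code path is waiting for them (the rebuild had not registered a waiter), so `pop_instance_event` finds nothing and logs "Received unexpected event". A minimal sketch of that dispatch pattern, with illustrative names (`EventDispatcher`, `_waiters` are not nova's actual classes; a plain `threading.Lock` stands in for the per-instance `"<uuid>-events"` named lock from oslo.concurrency):

```python
import threading

class EventDispatcher:
    """Hypothetical sketch of nova's InstanceEvents pop_instance_event pattern."""

    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" named lock
        self._waiters = {}              # {(instance_uuid, event_name): waiter}

    def prepare(self, instance_uuid, event_name, waiter):
        # A caller that expects an event registers a waiter first.
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter

    def pop(self, instance_uuid, event_name):
        # Returns the registered waiter, or None -- the "unexpected event" case,
        # which the real compute manager reports with the WARNING seen above.
        with self._lock:
            return self._waiters.pop((instance_uuid, event_name), None)

d = EventDispatcher()
assert d.pop("1bfee9bd", "network-vif-unplugged-ea5d5ce9") is None  # nothing waiting
d.prepare("1bfee9bd", "network-vif-plugged-ea5d5ce9", waiter="w1")
assert d.pop("1bfee9bd", "network-vif-plugged-ea5d5ce9") == "w1"
```

During a rebuild with `vm_state active` this is benign: the event simply arrived before (or without) anyone blocking on it.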
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.323 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.387 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.part --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.388 2 DEBUG nova.virt.images [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] 49dfbd2a-dfaa-487f-a950-ea6e453241db was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.390 2 DEBUG nova.privsep.utils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.391 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.part /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.959 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.part /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.converted" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:13 np0005470441 nova_compute[192626]: 2025-10-04 05:40:13.965 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.054 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951.converted --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.056 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "e27831ca76d76d559705407a4522553250354951" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
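The sequence just completed is the image-cache fetch path: under the per-image lock, the qcow2 is downloaded to `<hash>.part`, probed with `qemu-img info`, converted to raw into `<hash>.converted` (the "was qcow2, converting to raw" line), and re-probed before the lock is released. A hedged sketch that only rebuilds the command lines as argv lists (the `build_*` helpers are illustrative; the flags mirror the logged commands, minus the `oslo_concurrency.prlimit` wrapper that caps address space and CPU time):

```python
def build_info_cmd(path):
    # Mirrors: env LC_ALL=C LANG=C qemu-img info <path> --force-share --output=json
    return ["env", "LC_ALL=C", "LANG=C", "qemu-img", "info",
            path, "--force-share", "--output=json"]

def build_convert_cmd(src, dst):
    # Mirrors: qemu-img convert -t none -O raw -f qcow2 <src> <dst>
    # -t none bypasses the host page cache; the log probed direct-I/O
    # support on /var/lib/nova/instances just before running this.
    return ["qemu-img", "convert", "-t", "none", "-O", "raw",
            "-f", "qcow2", src, dst]

base = "/var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951"
convert = build_convert_cmd(base + ".part", base + ".converted")
```

Converting the cached base to raw avoids qcow2-on-qcow2 chains: every instance overlay can then reference one flat, shared base file.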
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.074 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.129 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.131 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "e27831ca76d76d559705407a4522553250354951" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.132 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "e27831ca76d76d559705407a4522553250354951" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.160 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.218 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.219 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951,backing_fmt=raw /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.255 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951,backing_fmt=raw /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.257 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "e27831ca76d76d559705407a4522553250354951" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
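The `qemu-img create` just logged is the copy-on-write step: the per-instance `disk` is a small qcow2 overlay whose backing file is the shared raw base image, sized to the flavor's 1 GiB root disk. A sketch of the command construction (the helper name is illustrative; the flags match the logged invocation):

```python
def build_overlay_cmd(backing, overlay, size_bytes):
    # Mirrors: env LC_ALL=C LANG=C qemu-img create -f qcow2 \
    #   -o backing_file=<backing>,backing_fmt=raw <overlay> <size>
    return ["env", "LC_ALL=C", "LANG=C", "qemu-img", "create", "-f", "qcow2",
            "-o", f"backing_file={backing},backing_fmt=raw",
            overlay, str(size_bytes)]

cmd = build_overlay_cmd(
    "/var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951",
    "/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk",
    1073741824)  # 1 GiB == flavor m1.nano's root_gb
```

The overlay stores only blocks the guest writes; unwritten reads fall through to the base, which is why creating it takes 0.036s regardless of image size.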
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.257 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.313 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e27831ca76d76d559705407a4522553250354951 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.314 2 DEBUG nova.virt.disk.api [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Checking if we can resize image /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.314 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.373 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.374 2 DEBUG nova.virt.disk.api [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Cannot resize image /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
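"Cannot resize image ... to a smaller size" is the expected outcome here, not an error: the resize check only permits growing a disk, since shrinking would truncate guest data, and the rebuilt overlay's virtual size already equals the flavor's 1 GiB. A minimal sketch of that comparison, assuming (as the log suggests) the rule is strictly-greater-than:

```python
def can_resize_image(virtual_size, requested_size):
    # Sketch of the check in nova.virt.disk.api: only growing is allowed;
    # a smaller or equal target is refused and the resize is skipped.
    return requested_size > virtual_size

GiB = 1024 ** 3
assert can_resize_image(1 * GiB, 1 * GiB) is False  # the case in this log
assert can_resize_image(1 * GiB, 2 * GiB) is True
```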
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.375 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.375 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Ensure instance console log exists: /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.376 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.376 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.376 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.379 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start _get_guest_xml network_info=[{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:26Z,direct_url=<?>,disk_format='qcow2',id=49dfbd2a-dfaa-487f-a950-ea6e453241db,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:27Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.384 2 WARNING nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.389 2 DEBUG nova.virt.libvirt.host [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.389 2 DEBUG nova.virt.libvirt.host [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.392 2 DEBUG nova.virt.libvirt.host [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.393 2 DEBUG nova.virt.libvirt.host [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.394 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.394 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:26Z,direct_url=<?>,disk_format='qcow2',id=49dfbd2a-dfaa-487f-a950-ea6e453241db,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:27Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.395 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.395 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.395 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.395 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.396 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.396 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.396 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.396 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.396 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.397 2 DEBUG nova.virt.hardware [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
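The topology lines above show the search space: with no flavor or image constraints, limits default to 65536 sockets/cores/threads, and nova enumerates every (sockets, cores, threads) triple whose product equals the vCPU count; for 1 vCPU the only candidate is (1, 1, 1), hence "Got 1 possible topologies". An illustrative reconstruction (nova's actual iteration order and preference sorting differ; this only reproduces the feasibility condition):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    # Enumerate triples whose product is exactly the vCPU count, within limits.
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos

assert possible_topologies(1) == [(1, 1, 1)]   # the m1.nano case in this log
```

With more vCPUs the list grows (e.g. 4 vCPUs admits 2x2x1, 4x1x1, ...), and the preferred/maximum constraints from flavor or image properties would prune and reorder it.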
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.397 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.420 2 DEBUG nova.virt.libvirt.vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:39:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:40:11Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.421 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.421 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.423 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <uuid>1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</uuid>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <name>instance-0000001c</name>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-506251172</nova:name>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:40:14</nova:creationTime>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:user uuid="d65c768451494a3f9e4f9a238fa5c40d">tempest-TestNetworkAdvancedServerOps-1635331179-project-member</nova:user>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:project uuid="d0c087ea0f62444e80490916b42c760f">tempest-TestNetworkAdvancedServerOps-1635331179</nova:project>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="49dfbd2a-dfaa-487f-a950-ea6e453241db"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        <nova:port uuid="ea5d5ce9-b8c0-45ae-8462-bfa1288280c9">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="serial">1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="uuid">1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:94:6c:37"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <target dev="tapea5d5ce9-b8"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/console.log" append="off"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:40:14 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:40:14 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:40:14 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:40:14 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.424 2 DEBUG nova.virt.libvirt.vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:39:49Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:40:11Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.425 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.425 2 DEBUG nova.network.os_vif_util [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.426 2 DEBUG os_vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.427 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.427 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.430 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea5d5ce9-b8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.430 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea5d5ce9-b8, col_values=(('external_ids', {'iface-id': 'ea5d5ce9-b8c0-45ae-8462-bfa1288280c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:94:6c:37', 'vm-uuid': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:14 np0005470441 NetworkManager[51690]: <info>  [1759556414.4330] manager: (tapea5d5ce9-b8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.440 2 INFO os_vif [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8')#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.488 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.489 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.489 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No VIF found with MAC fa:16:3e:94:6c:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.489 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Using config drive#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.502 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:14 np0005470441 nova_compute[192626]: 2025-10-04 05:40:14.572 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'keypairs' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.257 2 INFO nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Creating config drive at /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.266 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplmwyrete execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.395 2 DEBUG oslo_concurrency.processutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplmwyrete" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:15 np0005470441 kernel: tapea5d5ce9-b8: entered promiscuous mode
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.4817] manager: (tapea5d5ce9-b8): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct  4 01:40:15 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:15Z|00222|binding|INFO|Claiming lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for this chassis.
Oct  4 01:40:15 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:15Z|00223|binding|INFO|ea5d5ce9-b8c0-45ae-8462-bfa1288280c9: Claiming fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.490 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:6c:37 10.100.0.8'], port_security=['fa:16:3e:94:6c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1116f56-9520-48d8-8bb2-2519f97b3338', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '48042eb9-5c9a-49d3-9ddf-88f8ff74c14b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67ce5926-63fc-4d20-a2ee-8b5c0eb6e716, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.492 103689 INFO neutron.agent.ovn.metadata.agent [-] Port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 in datapath f1116f56-9520-48d8-8bb2-2519f97b3338 bound to our chassis#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.494 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1116f56-9520-48d8-8bb2-2519f97b3338#033[00m
Oct  4 01:40:15 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:15Z|00224|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 ovn-installed in OVS
Oct  4 01:40:15 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:15Z|00225|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 up in Southbound
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:15 np0005470441 systemd-udevd[226655]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.514 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6e1737cb-7951-4882-b5d8-bf010338a597]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.516 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf1116f56-91 in ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.518 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf1116f56-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.518 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1dbd9653-0471-4fb7-9376-817ad0b986e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.519 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6d070177-7f1c-46ee-a693-21da2818ce15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 systemd-machined[152624]: New machine qemu-16-instance-0000001c.
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.5317] device (tapea5d5ce9-b8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.5325] device (tapea5d5ce9-b8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.531 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[74c8aff2-ba6a-4243-af01-6707d0ee799a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.543 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a6787747-4cfa-481b-a5fe-1b67d8d7ddc2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 systemd[1]: Started Virtual Machine qemu-16-instance-0000001c.
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.574 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[e5044559-3237-4cfa-b6bf-8aa51b39beae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.578 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bf82fbe7-68c6-4e19-b9b5-e497857a5d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.5796] manager: (tapf1116f56-90): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.607 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8a838d38-c40a-41b7-b7d8-4aed9f11e39c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.610 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c027d0b3-dc0e-4f97-b012-9d4de11b5f9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.6313] device (tapf1116f56-90): carrier: link connected
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.636 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[e39e324f-ee23-49b3-9ba7-d2f8a4c1713b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.652 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f011d5e0-6ff7-4429-a977-527c166ad3b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1116f56-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:0b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431328, 'reachable_time': 35473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226689, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.666 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[30872f63-affa-4dba-8a43-129402b6fb8a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe11:b52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431328, 'tstamp': 431328}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226690, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.682 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9ab852ce-903f-47cb-9fab-480978cd158c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1116f56-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:11:0b:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431328, 'reachable_time': 35473, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226691, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.708 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b4935b83-c4b2-4e43-948a-8ca33b900484]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.760 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2424c1e4-c024-4cc3-8ce1-3c4436266500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.761 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1116f56-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.764 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.765 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1116f56-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:15 np0005470441 kernel: tapf1116f56-90: entered promiscuous mode
Oct  4 01:40:15 np0005470441 NetworkManager[51690]: <info>  [1759556415.7681] manager: (tapf1116f56-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.771 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1116f56-90, col_values=(('external_ids', {'iface-id': '97bc4ce0-5f5c-4023-9415-069692deb3ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:15 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:15Z|00226|binding|INFO|Releasing lport 97bc4ce0-5f5c-4023-9415-069692deb3ec from this chassis (sb_readonly=0)
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.774 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.774 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[257e0d3a-fda6-4d81-8125-c1fa00e8ea0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.775 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-f1116f56-9520-48d8-8bb2-2519f97b3338
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/f1116f56-9520-48d8-8bb2-2519f97b3338.pid.haproxy
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID f1116f56-9520-48d8-8bb2-2519f97b3338
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:40:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:15.776 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'env', 'PROCESS_TAG=haproxy-f1116f56-9520-48d8-8bb2-2519f97b3338', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f1116f56-9520-48d8-8bb2-2519f97b3338.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:40:15 np0005470441 nova_compute[192626]: 2025-10-04 05:40:15.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:16 np0005470441 podman[226723]: 2025-10-04 05:40:16.156624263 +0000 UTC m=+0.053672286 container create fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:40:16 np0005470441 systemd[1]: Started libpod-conmon-fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301.scope.
Oct  4 01:40:16 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:40:16 np0005470441 podman[226723]: 2025-10-04 05:40:16.125929961 +0000 UTC m=+0.022978024 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:40:16 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81acea7a39d6a1e13601525d532819306cf46ad161da5e12d6029feed16dc289/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:40:16 np0005470441 podman[226723]: 2025-10-04 05:40:16.236552885 +0000 UTC m=+0.133600908 container init fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  4 01:40:16 np0005470441 podman[226723]: 2025-10-04 05:40:16.241857256 +0000 UTC m=+0.138905279 container start fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  4 01:40:16 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [NOTICE]   (226742) : New worker (226744) forked
Oct  4 01:40:16 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [NOTICE]   (226742) : Loading success.
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.222 2 DEBUG nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.223 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.223 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.223 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.223 2 DEBUG nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] No waiting events found dispatching network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.224 2 WARNING nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received unexpected event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.224 2 DEBUG nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.224 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.224 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.224 2 DEBUG oslo_concurrency.lockutils [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.225 2 DEBUG nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] No waiting events found dispatching network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.225 2 WARNING nova.compute.manager [req-cc1a24d0-0cad-453f-a8c9-7453e1560132 req-4b8670af-9bf8-488d-8125-2f3e25b0dd11 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received unexpected event network-vif-plugged-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.283 2 DEBUG nova.compute.manager [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.284 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.284 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Removed pending event for 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.284 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556417.2817261, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.284 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.288 2 INFO nova.virt.libvirt.driver [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance spawned successfully.#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.288 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.317 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.322 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.322 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.323 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.323 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.323 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.324 2 DEBUG nova.virt.libvirt.driver [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.328 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.364 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.364 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556417.2822635, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.364 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Started (Lifecycle Event)#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.392 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.396 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.403 2 DEBUG nova.compute.manager [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.414 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.463 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.463 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.464 2 DEBUG nova.objects.instance [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.531 2 DEBUG oslo_concurrency.lockutils [None req-9026907e-7f19-4fa5-a6ce-9cc1518970fd d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:17 np0005470441 nova_compute[192626]: 2025-10-04 05:40:17.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:18Z|00227|binding|INFO|Releasing lport 97bc4ce0-5f5c-4023-9415-069692deb3ec from this chassis (sb_readonly=0)
Oct  4 01:40:18 np0005470441 nova_compute[192626]: 2025-10-04 05:40:18.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:19 np0005470441 podman[226761]: 2025-10-04 05:40:19.30144729 +0000 UTC m=+0.051659239 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:40:19 np0005470441 podman[226760]: 2025-10-04 05:40:19.333304325 +0000 UTC m=+0.086643403 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:40:19 np0005470441 nova_compute[192626]: 2025-10-04 05:40:19.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:21 np0005470441 nova_compute[192626]: 2025-10-04 05:40:21.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:22 np0005470441 nova_compute[192626]: 2025-10-04 05:40:22.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:22.255 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:22.257 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:40:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:22.258 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:22 np0005470441 nova_compute[192626]: 2025-10-04 05:40:22.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:23 np0005470441 nova_compute[192626]: 2025-10-04 05:40:23.846 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556408.843837, 369c5da6-9c6d-48e7-a402-88f996ed8276 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:23 np0005470441 nova_compute[192626]: 2025-10-04 05:40:23.846 2 INFO nova.compute.manager [-] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:40:23 np0005470441 nova_compute[192626]: 2025-10-04 05:40:23.866 2 DEBUG nova.compute.manager [None req-86c2d1b6-bb6a-4d7d-8487-ffaedd9927ee - - - - - -] [instance: 369c5da6-9c6d-48e7-a402-88f996ed8276] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:24 np0005470441 podman[226802]: 2025-10-04 05:40:24.306190202 +0000 UTC m=+0.051912516 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:40:24 np0005470441 podman[226803]: 2025-10-04 05:40:24.317528075 +0000 UTC m=+0.061189260 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  4 01:40:24 np0005470441 nova_compute[192626]: 2025-10-04 05:40:24.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:27 np0005470441 nova_compute[192626]: 2025-10-04 05:40:27.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:28Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:40:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:28Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:94:6c:37 10.100.0.8
Oct  4 01:40:29 np0005470441 nova_compute[192626]: 2025-10-04 05:40:29.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:32 np0005470441 podman[226852]: 2025-10-04 05:40:32.298439951 +0000 UTC m=+0.055703434 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:40:32 np0005470441 nova_compute[192626]: 2025-10-04 05:40:32.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:33 np0005470441 nova_compute[192626]: 2025-10-04 05:40:33.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:34 np0005470441 nova_compute[192626]: 2025-10-04 05:40:34.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:34 np0005470441 nova_compute[192626]: 2025-10-04 05:40:34.757 2 INFO nova.compute.manager [None req-46eb8c84-139a-4e7a-b3de-58912d3bf406 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Get console output#033[00m
Oct  4 01:40:34 np0005470441 nova_compute[192626]: 2025-10-04 05:40:34.762 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:40:36 np0005470441 podman[226875]: 2025-10-04 05:40:36.31333839 +0000 UTC m=+0.062049735 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.675 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.675 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.676 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.676 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.676 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.677 2 INFO nova.compute.manager [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Terminating instance#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.678 2 DEBUG nova.compute.manager [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:40:36 np0005470441 kernel: tapea5d5ce9-b8 (unregistering): left promiscuous mode
Oct  4 01:40:36 np0005470441 NetworkManager[51690]: <info>  [1759556436.6964] device (tapea5d5ce9-b8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:36Z|00228|binding|INFO|Releasing lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 from this chassis (sb_readonly=0)
Oct  4 01:40:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:36Z|00229|binding|INFO|Setting lport ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 down in Southbound
Oct  4 01:40:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:36Z|00230|binding|INFO|Removing iface tapea5d5ce9-b8 ovn-installed in OVS
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.721 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:94:6c:37 10.100.0.8'], port_security=['fa:16:3e:94:6c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1116f56-9520-48d8-8bb2-2519f97b3338', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '48042eb9-5c9a-49d3-9ddf-88f8ff74c14b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=67ce5926-63fc-4d20-a2ee-8b5c0eb6e716, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.722 103689 INFO neutron.agent.ovn.metadata.agent [-] Port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 in datapath f1116f56-9520-48d8-8bb2-2519f97b3338 unbound from our chassis#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.723 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1116f56-9520-48d8-8bb2-2519f97b3338, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.724 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[86a26f94-933e-4c8a-b34d-7f0edd7b95b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.725 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 namespace which is not needed anymore#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.760 2 DEBUG nova.compute.manager [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.760 2 DEBUG nova.compute.manager [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing instance network info cache due to event network-changed-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.761 2 DEBUG oslo_concurrency.lockutils [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.761 2 DEBUG oslo_concurrency.lockutils [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.761 2 DEBUG nova.network.neutron [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Refreshing network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:40:36 np0005470441 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct  4 01:40:36 np0005470441 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000001c.scope: Consumed 14.605s CPU time.
Oct  4 01:40:36 np0005470441 systemd-machined[152624]: Machine qemu-16-instance-0000001c terminated.
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [NOTICE]   (226742) : haproxy version is 2.8.14-c23fe91
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [NOTICE]   (226742) : path to executable is /usr/sbin/haproxy
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [WARNING]  (226742) : Exiting Master process...
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [WARNING]  (226742) : Exiting Master process...
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [ALERT]    (226742) : Current worker (226744) exited with code 143 (Terminated)
Oct  4 01:40:36 np0005470441 neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338[226738]: [WARNING]  (226742) : All workers exited. Exiting... (0)
Oct  4 01:40:36 np0005470441 systemd[1]: libpod-fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301.scope: Deactivated successfully.
Oct  4 01:40:36 np0005470441 podman[226924]: 2025-10-04 05:40:36.84813828 +0000 UTC m=+0.045660359 container died fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  4 01:40:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301-userdata-shm.mount: Deactivated successfully.
Oct  4 01:40:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay-81acea7a39d6a1e13601525d532819306cf46ad161da5e12d6029feed16dc289-merged.mount: Deactivated successfully.
Oct  4 01:40:36 np0005470441 podman[226924]: 2025-10-04 05:40:36.883241507 +0000 UTC m=+0.080763586 container cleanup fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:40:36 np0005470441 systemd[1]: libpod-conmon-fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301.scope: Deactivated successfully.
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.936 2 INFO nova.virt.libvirt.driver [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Instance destroyed successfully.#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.936 2 DEBUG nova.objects.instance [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'resources' on Instance uuid 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:36 np0005470441 podman[226956]: 2025-10-04 05:40:36.941956616 +0000 UTC m=+0.036564010 container remove fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.946 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c1063723-62fc-4386-8cd8-e4ac9df64c26]: (4, ('Sat Oct  4 05:40:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 (fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301)\nfe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301\nSat Oct  4 05:40:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 (fe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301)\nfe8260c0ca7eb47c0b73a0ead7d033d6708bf3ef05302be6a8005c42a77d0301\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.947 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[810dd8d0-ee15-4445-9d03-b7020e4bd1e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:36.948 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1116f56-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.959 2 DEBUG nova.virt.libvirt.vif [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-10-04T05:39:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-506251172',display_name='tempest-TestNetworkAdvancedServerOps-server-506251172',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-506251172',id=28,image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIjXhDdMZBitPDdhOijnlDosYzWTa95Sosui3/U7Aj23EXkPreyDXr77ZxqvYSkIYSs4SsfMo+dHVtQsDAqtEMSF48ZFb97HgEie6xjWesHmfe4SD9fho4cwWF6eCwTY/g==',key_name='tempest-TestNetworkAdvancedServerOps-512019839',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-pop27zoe',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='49dfbd2a-dfaa-487f-a950-ea6e453241db',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:17Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.959 2 DEBUG nova.network.os_vif_util [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.960 2 DEBUG nova.network.os_vif_util [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.960 2 DEBUG os_vif [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.962 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea5d5ce9-b8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:36 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:36 np0005470441 kernel: tapf1116f56-90: left promiscuous mode
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:36.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.001 2 INFO os_vif [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:94:6c:37,bridge_name='br-int',has_traffic_filtering=True,id=ea5d5ce9-b8c0-45ae-8462-bfa1288280c9,network=Network(f1116f56-9520-48d8-8bb2-2519f97b3338),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea5d5ce9-b8')#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.002 2 INFO nova.virt.libvirt.driver [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Deleting instance files /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016_del#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.002 2 INFO nova.virt.libvirt.driver [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Deletion of /var/lib/nova/instances/1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016_del complete#033[00m
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.003 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2c88090e-6783-403f-b809-efeaaa54fbae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.033 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[eafc2996-a637-4f23-9661-c182fea7862a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.034 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7903bc-58ab-46c7-b896-82d322097cc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.055 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d62bd285-7d4a-4515-b89d-4c338b429404]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431322, 'reachable_time': 30152, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226984, 'error': None, 'target': 'ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:37 np0005470441 systemd[1]: run-netns-ovnmeta\x2df1116f56\x2d9520\x2d48d8\x2d8bb2\x2d2519f97b3338.mount: Deactivated successfully.
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.058 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f1116f56-9520-48d8-8bb2-2519f97b3338 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:40:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:37.059 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[d345ac87-8273-4e7a-9907-f54d7addeeb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.104 2 INFO nova.compute.manager [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.105 2 DEBUG oslo.service.loopingcall [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.105 2 DEBUG nova.compute.manager [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.105 2 DEBUG nova.network.neutron [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:40:37 np0005470441 nova_compute[192626]: 2025-10-04 05:40:37.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:38 np0005470441 podman[226985]: 2025-10-04 05:40:38.292499974 +0000 UTC m=+0.049831407 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:40:38 np0005470441 nova_compute[192626]: 2025-10-04 05:40:38.807 2 DEBUG nova.network.neutron [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updated VIF entry in instance network info cache for port ea5d5ce9-b8c0-45ae-8462-bfa1288280c9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:40:38 np0005470441 nova_compute[192626]: 2025-10-04 05:40:38.808 2 DEBUG nova.network.neutron [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [{"id": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "address": "fa:16:3e:94:6c:37", "network": {"id": "f1116f56-9520-48d8-8bb2-2519f97b3338", "bridge": "br-int", "label": "tempest-network-smoke--1086735460", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea5d5ce9-b8", "ovs_interfaceid": "ea5d5ce9-b8c0-45ae-8462-bfa1288280c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:38 np0005470441 nova_compute[192626]: 2025-10-04 05:40:38.864 2 DEBUG oslo_concurrency.lockutils [req-28ca9c48-9215-419e-b944-871069cf83ed req-7b72ce31-dbda-4cce-92eb-1edeb1177c4d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.274 2 DEBUG nova.network.neutron [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.327 2 INFO nova.compute.manager [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Took 2.22 seconds to deallocate network for instance.
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.369 2 DEBUG nova.compute.manager [req-cd71c1ec-0a1c-4b1b-a94e-41abbecdd14f req-5ec469f2-8974-4aa3-ba89-d36d5b63576a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Received event network-vif-deleted-ea5d5ce9-b8c0-45ae-8462-bfa1288280c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.426 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.427 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.487 2 DEBUG nova.compute.provider_tree [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.509 2 DEBUG nova.scheduler.client.report [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.562 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.586 2 INFO nova.scheduler.client.report [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Deleted allocations for instance 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016
Oct  4 01:40:39 np0005470441 nova_compute[192626]: 2025-10-04 05:40:39.671 2 DEBUG oslo_concurrency.lockutils [None req-019c3c83-9a3e-4a52-9f81-de9f90c395c9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.996s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:41 np0005470441 podman[227004]: 2025-10-04 05:40:41.373495967 +0000 UTC m=+0.113186319 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:40:41 np0005470441 nova_compute[192626]: 2025-10-04 05:40:41.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:42 np0005470441 nova_compute[192626]: 2025-10-04 05:40:42.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.567 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.568 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.596 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.745 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.745 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.752 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.752 2 INFO nova.compute.claims [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Claim successful on node compute-1.ctlplane.example.com
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.869 2 DEBUG nova.compute.provider_tree [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.896 2 DEBUG nova.scheduler.client.report [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.919 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.920 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.992 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  4 01:40:43 np0005470441 nova_compute[192626]: 2025-10-04 05:40:43.993 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.042 2 INFO nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.069 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.190 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.191 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.191 2 INFO nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Creating image(s)
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.192 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.192 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.193 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.205 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.263 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.264 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.265 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.276 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.311 2 DEBUG nova.policy [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.332 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.333 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.381 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.382 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.382 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.436 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.437 2 DEBUG nova.virt.disk.api [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.438 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.492 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.493 2 DEBUG nova.virt.disk.api [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.494 2 DEBUG nova.objects.instance [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.509 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.510 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Ensure instance console log exists: /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.510 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.511 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:40:44 np0005470441 nova_compute[192626]: 2025-10-04 05:40:44.511 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:40:46 np0005470441 nova_compute[192626]: 2025-10-04 05:40:46.308 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Successfully created port: c638bfcb-e144-4a6d-9626-9ae28c1a6437 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:40:46 np0005470441 nova_compute[192626]: 2025-10-04 05:40:46.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:47 np0005470441 nova_compute[192626]: 2025-10-04 05:40:47.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:40:48 np0005470441 nova_compute[192626]: 2025-10-04 05:40:48.315 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Successfully updated port: c638bfcb-e144-4a6d-9626-9ae28c1a6437 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  4 01:40:48 np0005470441 nova_compute[192626]: 2025-10-04 05:40:48.358 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:40:48 np0005470441 nova_compute[192626]: 2025-10-04 05:40:48.358 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:40:48 np0005470441 nova_compute[192626]: 2025-10-04 05:40:48.359 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  4 01:40:48 np0005470441 nova_compute[192626]: 2025-10-04 05:40:48.584 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  4 01:40:50 np0005470441 nova_compute[192626]: 2025-10-04 05:40:50.065 2 DEBUG nova.compute.manager [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:40:50 np0005470441 nova_compute[192626]: 2025-10-04 05:40:50.065 2 DEBUG nova.compute.manager [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing instance network info cache due to event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:40:50 np0005470441 nova_compute[192626]: 2025-10-04 05:40:50.066 2 DEBUG oslo_concurrency.lockutils [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:40:50 np0005470441 podman[227049]: 2025-10-04 05:40:50.291876888 +0000 UTC m=+0.049882049 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:40:50 np0005470441 podman[227050]: 2025-10-04 05:40:50.292496665 +0000 UTC m=+0.047271374 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:40:50 np0005470441 nova_compute[192626]: 2025-10-04 05:40:50.986 2 DEBUG nova.network.neutron [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.007 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.007 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance network_info: |[{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.007 2 DEBUG oslo_concurrency.lockutils [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.008 2 DEBUG nova.network.neutron [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.012 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Start _get_guest_xml network_info=[{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.017 2 WARNING nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.027 2 DEBUG nova.virt.libvirt.host [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.029 2 DEBUG nova.virt.libvirt.host [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.036 2 DEBUG nova.virt.libvirt.host [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.037 2 DEBUG nova.virt.libvirt.host [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.038 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.038 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.038 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.039 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.039 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.039 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.039 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.039 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.040 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.040 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.040 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.040 2 DEBUG nova.virt.hardware [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.043 2 DEBUG nova.virt.libvirt.vif [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:40:44Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.043 2 DEBUG nova.network.os_vif_util [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.044 2 DEBUG nova.network.os_vif_util [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.045 2 DEBUG nova.objects.instance [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.063 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <uuid>dfa10e04-6283-4c0a-94b0-6b4841e55401</uuid>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <name>instance-0000001f</name>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:40:51</nova:creationTime>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="serial">dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="uuid">dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:37:07:15"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <target dev="tapc638bfcb-e1"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log" append="off"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:40:51 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:40:51 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:40:51 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:40:51 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.064 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Preparing to wait for external event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.064 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.065 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.065 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.065 2 DEBUG nova.virt.libvirt.vif [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:40:44Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.066 2 DEBUG nova.network.os_vif_util [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.066 2 DEBUG nova.network.os_vif_util [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.066 2 DEBUG os_vif [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.067 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.068 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc638bfcb-e1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.070 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc638bfcb-e1, col_values=(('external_ids', {'iface-id': 'c638bfcb-e144-4a6d-9626-9ae28c1a6437', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:07:15', 'vm-uuid': 'dfa10e04-6283-4c0a-94b0-6b4841e55401'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:51 np0005470441 NetworkManager[51690]: <info>  [1759556451.1205] manager: (tapc638bfcb-e1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.126 2 INFO os_vif [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1')#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.192 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.193 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.193 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:37:07:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.193 2 INFO nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Using config drive#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.671 2 INFO nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Creating config drive at /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.676 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6z04nxi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.800 2 DEBUG oslo_concurrency.processutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6z04nxi" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:40:51 np0005470441 kernel: tapc638bfcb-e1: entered promiscuous mode
Oct  4 01:40:51 np0005470441 NetworkManager[51690]: <info>  [1759556451.8690] manager: (tapc638bfcb-e1): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Oct  4 01:40:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:51Z|00231|binding|INFO|Claiming lport c638bfcb-e144-4a6d-9626-9ae28c1a6437 for this chassis.
Oct  4 01:40:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:51Z|00232|binding|INFO|c638bfcb-e144-4a6d-9626-9ae28c1a6437: Claiming fa:16:3e:37:07:15 10.100.0.8
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 systemd-udevd[227112]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.895 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:07:15 10.100.0.8'], port_security=['fa:16:3e:37:07:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aa9609d0-80fa-4d9b-8c8b-3ac9c8b42178', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aa7a6f0-a5c6-4212-9ae8-0a0cc5360bf3, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c638bfcb-e144-4a6d-9626-9ae28c1a6437) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.896 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c638bfcb-e144-4a6d-9626-9ae28c1a6437 in datapath a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 bound to our chassis#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.897 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9#033[00m
Oct  4 01:40:51 np0005470441 NetworkManager[51690]: <info>  [1759556451.9073] device (tapc638bfcb-e1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.907 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[59d24d32-114c-4b54-8f8c-bad014a9a2e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.908 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1a4f623-11 in ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.910 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1a4f623-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.910 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e9cd6622-72b4-4111-8c07-a980d034c869]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.910 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[52e9e075-6d5e-4e75-b602-60f88a38b238]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 NetworkManager[51690]: <info>  [1759556451.9112] device (tapc638bfcb-e1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:40:51 np0005470441 systemd-machined[152624]: New machine qemu-17-instance-0000001f.
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.921 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[0d08b8c1-42ad-4600-984f-f2900ac33b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:51Z|00233|binding|INFO|Setting lport c638bfcb-e144-4a6d-9626-9ae28c1a6437 ovn-installed in OVS
Oct  4 01:40:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:51Z|00234|binding|INFO|Setting lport c638bfcb-e144-4a6d-9626-9ae28c1a6437 up in Southbound
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:51 np0005470441 systemd[1]: Started Virtual Machine qemu-17-instance-0000001f.
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.935 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556436.934596, 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.936 2 INFO nova.compute.manager [-] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.945 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d5cb3b83-2b30-4e55-9299-225d5bdfa5c4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.975 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1799f828-1c4c-463b-a35f-709be98e056f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:51.980 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ec286718-0ec1-4c26-979d-f1c4b03af22e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:51 np0005470441 NetworkManager[51690]: <info>  [1759556451.9809] manager: (tapa1a4f623-10): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Oct  4 01:40:51 np0005470441 nova_compute[192626]: 2025-10-04 05:40:51.998 2 DEBUG nova.compute.manager [None req-a9e5bb20-0ea6-4239-aad4-744aea40bfeb - - - - - -] [instance: 1bfee9bd-0c61-4eb5-b1e7-2bcb65ed9016] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.015 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[698eb8a4-cf4b-4bf3-99e4-38b1640ce1fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.018 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[03113056-37f3-408d-8b82-7757fd10ebf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 NetworkManager[51690]: <info>  [1759556452.0372] device (tapa1a4f623-10): carrier: link connected
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.042 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9357d03b-6f6b-4f01-9f5a-899acbab6f9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.057 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e65dc696-c7c2-4b69-830a-aca4663d9e35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1a4f623-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:93:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434969, 'reachable_time': 15658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227146, 'error': None, 'target': 'ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.069 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b848bf48-a9d7-4171-ab15-3f0c62147308]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb7:934c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434969, 'tstamp': 434969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227147, 'error': None, 'target': 'ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.084 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f2608d95-bef4-470b-9f39-a7f2caa94ed3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1a4f623-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b7:93:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434969, 'reachable_time': 15658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227148, 'error': None, 'target': 'ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.111 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ba5fa68a-45b7-4319-aa26-86d81f2c7524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.161 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[50a2f90e-3a81-4185-83c9-30162a3708c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.163 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1a4f623-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.163 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.163 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1a4f623-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 NetworkManager[51690]: <info>  [1759556452.2147] manager: (tapa1a4f623-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Oct  4 01:40:52 np0005470441 kernel: tapa1a4f623-10: entered promiscuous mode
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.219 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1a4f623-10, col_values=(('external_ids', {'iface-id': '224f2340-bf1e-48df-8648-a854ae221536'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:52Z|00235|binding|INFO|Releasing lport 224f2340-bf1e-48df-8648-a854ae221536 from this chassis (sb_readonly=0)
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.222 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.222 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[75faa0fd-5151-40bf-9603-bce1757397bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.223 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9.pid.haproxy
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:40:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:40:52.224 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'env', 'PROCESS_TAG=haproxy-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 podman[227187]: 2025-10-04 05:40:52.566888081 +0000 UTC m=+0.052934525 container create 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:40:52 np0005470441 systemd[1]: Started libpod-conmon-73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550.scope.
Oct  4 01:40:52 np0005470441 podman[227187]: 2025-10-04 05:40:52.536122657 +0000 UTC m=+0.022169201 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:40:52 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:40:52 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f63165a507e8a7e46f165a9d636c347c68924094d5be7a88cbf471d6239fcab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.663 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556452.662744, dfa10e04-6283-4c0a-94b0-6b4841e55401 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.664 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] VM Started (Lifecycle Event)#033[00m
Oct  4 01:40:52 np0005470441 podman[227187]: 2025-10-04 05:40:52.674344786 +0000 UTC m=+0.160391260 container init 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  4 01:40:52 np0005470441 podman[227187]: 2025-10-04 05:40:52.680780028 +0000 UTC m=+0.166826472 container start 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.682 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.686 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556452.6630046, dfa10e04-6283-4c0a-94b0-6b4841e55401 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.687 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:52 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [NOTICE]   (227207) : New worker (227209) forked
Oct  4 01:40:52 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [NOTICE]   (227207) : Loading success.
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.710 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.715 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:40:52 np0005470441 nova_compute[192626]: 2025-10-04 05:40:52.748 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:40:53 np0005470441 nova_compute[192626]: 2025-10-04 05:40:53.650 2 DEBUG nova.network.neutron [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated VIF entry in instance network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:40:53 np0005470441 nova_compute[192626]: 2025-10-04 05:40:53.651 2 DEBUG nova.network.neutron [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:40:53 np0005470441 nova_compute[192626]: 2025-10-04 05:40:53.669 2 DEBUG oslo_concurrency.lockutils [req-fe9c100b-6475-4f42-a6fc-1dc29d603e74 req-c1d9dafb-16d5-43d0-b886-807636148274 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.357 2 DEBUG nova.compute.manager [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.358 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.359 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.359 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.359 2 DEBUG nova.compute.manager [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Processing event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.360 2 DEBUG nova.compute.manager [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.360 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.360 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.361 2 DEBUG oslo_concurrency.lockutils [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.361 2 DEBUG nova.compute.manager [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.362 2 WARNING nova.compute.manager [req-c1fcba4a-51f1-4ff1-b844-ddcc3cf800d8 req-87547dc7-eec0-4496-b334-da215d2ce68d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 for instance with vm_state building and task_state spawning.#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.362 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.366 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556454.3659213, dfa10e04-6283-4c0a-94b0-6b4841e55401 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.366 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.368 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.373 2 INFO nova.virt.libvirt.driver [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance spawned successfully.#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.374 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.400 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.404 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.405 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.405 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.406 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.406 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.406 2 DEBUG nova.virt.libvirt.driver [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.410 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.446 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.487 2 INFO nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Took 10.30 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.488 2 DEBUG nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.556 2 INFO nova.compute.manager [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Took 10.89 seconds to build instance.#033[00m
Oct  4 01:40:54 np0005470441 nova_compute[192626]: 2025-10-04 05:40:54.573 2 DEBUG oslo_concurrency.lockutils [None req-371951c0-fa31-4514-a606-e00a491bf5e4 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:40:55 np0005470441 podman[227219]: 2025-10-04 05:40:55.317660297 +0000 UTC m=+0.062606230 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:40:55 np0005470441 podman[227218]: 2025-10-04 05:40:55.331273344 +0000 UTC m=+0.084247355 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:40:56 np0005470441 nova_compute[192626]: 2025-10-04 05:40:56.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:57 np0005470441 nova_compute[192626]: 2025-10-04 05:40:57.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:59 np0005470441 nova_compute[192626]: 2025-10-04 05:40:59.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:59 np0005470441 NetworkManager[51690]: <info>  [1759556459.5995] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Oct  4 01:40:59 np0005470441 NetworkManager[51690]: <info>  [1759556459.6004] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Oct  4 01:40:59 np0005470441 ovn_controller[94840]: 2025-10-04T05:40:59Z|00236|binding|INFO|Releasing lport 224f2340-bf1e-48df-8648-a854ae221536 from this chassis (sb_readonly=0)
Oct  4 01:40:59 np0005470441 nova_compute[192626]: 2025-10-04 05:40:59.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:59 np0005470441 nova_compute[192626]: 2025-10-04 05:40:59.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:40:59 np0005470441 nova_compute[192626]: 2025-10-04 05:40:59.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:00 np0005470441 nova_compute[192626]: 2025-10-04 05:41:00.121 2 DEBUG nova.compute.manager [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:41:00 np0005470441 nova_compute[192626]: 2025-10-04 05:41:00.121 2 DEBUG nova.compute.manager [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing instance network info cache due to event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:41:00 np0005470441 nova_compute[192626]: 2025-10-04 05:41:00.121 2 DEBUG oslo_concurrency.lockutils [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:41:00 np0005470441 nova_compute[192626]: 2025-10-04 05:41:00.122 2 DEBUG oslo_concurrency.lockutils [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:41:00 np0005470441 nova_compute[192626]: 2025-10-04 05:41:00.122 2 DEBUG nova.network.neutron [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:41:01 np0005470441 nova_compute[192626]: 2025-10-04 05:41:01.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:02 np0005470441 nova_compute[192626]: 2025-10-04 05:41:02.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:02 np0005470441 nova_compute[192626]: 2025-10-04 05:41:02.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:02 np0005470441 nova_compute[192626]: 2025-10-04 05:41:02.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:02 np0005470441 nova_compute[192626]: 2025-10-04 05:41:02.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:41:03 np0005470441 podman[227256]: 2025-10-04 05:41:03.319643672 +0000 UTC m=+0.072476561 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git)
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.742 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.744 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.838 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.903 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.904 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:03 np0005470441 nova_compute[192626]: 2025-10-04 05:41:03.967 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.138 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.140 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5598MB free_disk=73.42854309082031GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.140 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.141 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.236 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance dfa10e04-6283-4c0a-94b0-6b4841e55401 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.237 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.237 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.281 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.323 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.356 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.358 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.778 2 DEBUG nova.network.neutron [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated VIF entry in instance network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.779 2 DEBUG nova.network.neutron [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:41:04 np0005470441 nova_compute[192626]: 2025-10-04 05:41:04.809 2 DEBUG oslo_concurrency.lockutils [req-f50b0daf-6f7d-4f27-98de-1b0e999e4b28 req-2fdf5d08-33bd-47f5-9b97-6fa004940d0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:41:05 np0005470441 nova_compute[192626]: 2025-10-04 05:41:05.358 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:05 np0005470441 nova_compute[192626]: 2025-10-04 05:41:05.359 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:41:05 np0005470441 nova_compute[192626]: 2025-10-04 05:41:05.382 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:41:05 np0005470441 nova_compute[192626]: 2025-10-04 05:41:05.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:06 np0005470441 nova_compute[192626]: 2025-10-04 05:41:06.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:06 np0005470441 nova_compute[192626]: 2025-10-04 05:41:06.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:06 np0005470441 nova_compute[192626]: 2025-10-04 05:41:06.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:06.748 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:06.750 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:06.750 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:07 np0005470441 podman[227301]: 2025-10-04 05:41:07.323408135 +0000 UTC m=+0.067975563 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:41:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:07Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:37:07:15 10.100.0.8
Oct  4 01:41:07 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:07Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:37:07:15 10.100.0.8
Oct  4 01:41:07 np0005470441 nova_compute[192626]: 2025-10-04 05:41:07.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:07 np0005470441 nova_compute[192626]: 2025-10-04 05:41:07.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:09 np0005470441 podman[227325]: 2025-10-04 05:41:09.30958959 +0000 UTC m=+0.062358873 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:41:09 np0005470441 nova_compute[192626]: 2025-10-04 05:41:09.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:11 np0005470441 nova_compute[192626]: 2025-10-04 05:41:11.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:12 np0005470441 podman[227345]: 2025-10-04 05:41:12.382804208 +0000 UTC m=+0.126069654 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:41:12 np0005470441 nova_compute[192626]: 2025-10-04 05:41:12.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:13 np0005470441 nova_compute[192626]: 2025-10-04 05:41:13.064 2 INFO nova.compute.manager [None req-67e6b9f3-a15d-4d5d-b35f-8a397eb6cb74 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Get console output#033[00m
Oct  4 01:41:13 np0005470441 nova_compute[192626]: 2025-10-04 05:41:13.071 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:41:13 np0005470441 nova_compute[192626]: 2025-10-04 05:41:13.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:41:16 np0005470441 nova_compute[192626]: 2025-10-04 05:41:16.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:17 np0005470441 nova_compute[192626]: 2025-10-04 05:41:17.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:18 np0005470441 nova_compute[192626]: 2025-10-04 05:41:18.966 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:18 np0005470441 nova_compute[192626]: 2025-10-04 05:41:18.967 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:18 np0005470441 nova_compute[192626]: 2025-10-04 05:41:18.967 2 DEBUG nova.objects.instance [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'flavor' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:41:20 np0005470441 nova_compute[192626]: 2025-10-04 05:41:20.012 2 DEBUG nova.objects.instance [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_requests' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:41:20 np0005470441 nova_compute[192626]: 2025-10-04 05:41:20.032 2 DEBUG nova.network.neutron [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:41:21 np0005470441 nova_compute[192626]: 2025-10-04 05:41:21.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:21 np0005470441 nova_compute[192626]: 2025-10-04 05:41:21.198 2 DEBUG nova.policy [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:41:21 np0005470441 podman[227372]: 2025-10-04 05:41:21.321642183 +0000 UTC m=+0.061157449 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:41:21 np0005470441 podman[227371]: 2025-10-04 05:41:21.326283045 +0000 UTC m=+0.079029547 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:41:22 np0005470441 nova_compute[192626]: 2025-10-04 05:41:22.132 2 DEBUG nova.network.neutron [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Successfully created port: 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:41:22 np0005470441 nova_compute[192626]: 2025-10-04 05:41:22.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:22.725 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:41:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:22.726 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:41:22 np0005470441 nova_compute[192626]: 2025-10-04 05:41:22.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.277 2 DEBUG nova.network.neutron [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Successfully updated port: 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.297 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.297 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.298 2 DEBUG nova.network.neutron [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.404 2 DEBUG nova.compute.manager [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-changed-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.404 2 DEBUG nova.compute.manager [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing instance network info cache due to event network-changed-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:41:23 np0005470441 nova_compute[192626]: 2025-10-04 05:41:23.405 2 DEBUG oslo_concurrency.lockutils [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.822 2 DEBUG nova.network.neutron [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.849 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.851 2 DEBUG oslo_concurrency.lockutils [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.852 2 DEBUG nova.network.neutron [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing network info cache for port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.856 2 DEBUG nova.virt.libvirt.vif [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.856 2 DEBUG nova.network.os_vif_util [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.857 2 DEBUG nova.network.os_vif_util [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.858 2 DEBUG os_vif [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.860 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap97c6d65d-ad, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.866 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap97c6d65d-ad, col_values=(('external_ids', {'iface-id': '97c6d65d-ad9d-4e43-a41c-301d4df8ccdd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:03:22', 'vm-uuid': 'dfa10e04-6283-4c0a-94b0-6b4841e55401'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:25 np0005470441 NetworkManager[51690]: <info>  [1759556485.8696] manager: (tap97c6d65d-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.879 2 INFO os_vif [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad')#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.880 2 DEBUG nova.virt.libvirt.vif [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.881 2 DEBUG nova.network.os_vif_util [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.882 2 DEBUG nova.network.os_vif_util [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.885 2 DEBUG nova.virt.libvirt.guest [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] attach device xml: <interface type="ethernet">
Oct  4 01:41:25 np0005470441 nova_compute[192626]:  <mac address="fa:16:3e:0d:03:22"/>
Oct  4 01:41:25 np0005470441 nova_compute[192626]:  <model type="virtio"/>
Oct  4 01:41:25 np0005470441 nova_compute[192626]:  <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:41:25 np0005470441 nova_compute[192626]:  <mtu size="1442"/>
Oct  4 01:41:25 np0005470441 nova_compute[192626]:  <target dev="tap97c6d65d-ad"/>
Oct  4 01:41:25 np0005470441 nova_compute[192626]: </interface>
Oct  4 01:41:25 np0005470441 nova_compute[192626]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  4 01:41:25 np0005470441 kernel: tap97c6d65d-ad: entered promiscuous mode
Oct  4 01:41:25 np0005470441 NetworkManager[51690]: <info>  [1759556485.9006] manager: (tap97c6d65d-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:25Z|00237|binding|INFO|Claiming lport 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd for this chassis.
Oct  4 01:41:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:25Z|00238|binding|INFO|97c6d65d-ad9d-4e43-a41c-301d4df8ccdd: Claiming fa:16:3e:0d:03:22 10.100.0.24
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.919 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:03:22 10.100.0.24'], port_security=['fa:16:3e:0d:03:22 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'af6e9f85-2a38-4738-916c-2544b03b6a65', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=841d06f9-5fd2-480b-a960-98e4de1665c7, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.920 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd in datapath d2bb6534-edcf-4f06-bd53-7102e28ef382 bound to our chassis
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.922 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2bb6534-edcf-4f06-bd53-7102e28ef382
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.935 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[626e4265-b664-45d7-8497-c447c06122bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.935 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2bb6534-e1 in ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.937 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2bb6534-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.937 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4d28d8-f472-4e9f-a87f-b6588226fbbf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.938 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3577c8d6-24e9-478a-b982-091b788c1c3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.950 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[6f3c1786-ab6c-4b34-b18b-62804823f5e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:25Z|00239|binding|INFO|Setting lport 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd ovn-installed in OVS
Oct  4 01:41:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:25Z|00240|binding|INFO|Setting lport 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd up in Southbound
Oct  4 01:41:25 np0005470441 nova_compute[192626]: 2025-10-04 05:41:25.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:25.964 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f053423-71aa-47f5-81d3-b5e0d1f9c63b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:25 np0005470441 systemd-udevd[227448]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:41:25 np0005470441 NetworkManager[51690]: <info>  [1759556485.9923] device (tap97c6d65d-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:41:25 np0005470441 NetworkManager[51690]: <info>  [1759556485.9933] device (tap97c6d65d-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.000 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ef755af7-01b4-4d36-8dbf-a76339f536db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 NetworkManager[51690]: <info>  [1759556486.0049] manager: (tapd2bb6534-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.006 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f083ebd2-479b-4d34-a2e8-d487624dbb1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.013 2 DEBUG nova.virt.libvirt.driver [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.014 2 DEBUG nova.virt.libvirt.driver [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.014 2 DEBUG nova.virt.libvirt.driver [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:37:07:15, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.014 2 DEBUG nova.virt.libvirt.driver [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:0d:03:22, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  4 01:41:26 np0005470441 podman[227421]: 2025-10-04 05:41:26.020904882 +0000 UTC m=+0.078641736 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Oct  4 01:41:26 np0005470441 podman[227419]: 2025-10-04 05:41:26.027677674 +0000 UTC m=+0.090339538 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid)
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.036 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ea482d-e73b-456a-bfac-5471bf534284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.039 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[b64ea137-2dd6-4562-a24c-374f7048e790]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.048 2 DEBUG nova.virt.libvirt.guest [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:41:26</nova:creationTime>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:41:26 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    <nova:port uuid="97c6d65d-ad9d-4e43-a41c-301d4df8ccdd">
Oct  4 01:41:26 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:41:26 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:41:26 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:41:26 np0005470441 nova_compute[192626]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct  4 01:41:26 np0005470441 NetworkManager[51690]: <info>  [1759556486.0624] device (tapd2bb6534-e0): carrier: link connected
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.068 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[947f7ce0-f8a8-4550-8a07-7b57735d9b7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.080 2 DEBUG oslo_concurrency.lockutils [None req-dce7104a-7dd3-47d9-9380-888216d86aeb b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.083 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d5916109-b61e-4612-b7c3-a010cf48c039]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2bb6534-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:78:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438371, 'reachable_time': 39720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227488, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.101 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[859515fc-a2c9-42ac-9449-28a1e8b4bdf2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:7841'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438371, 'tstamp': 438371}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227489, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.119 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[17fb5364-0f90-4531-ab81-c12ee492c0b2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2bb6534-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:78:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438371, 'reachable_time': 39720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227490, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.151 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5c161222-15e4-441b-9141-68198b3b4ecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.209 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[70de8696-49f1-443f-8556-0fba33e08f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.210 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2bb6534-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.211 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.211 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2bb6534-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:26 np0005470441 kernel: tapd2bb6534-e0: entered promiscuous mode
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:26 np0005470441 NetworkManager[51690]: <info>  [1759556486.2146] manager: (tapd2bb6534-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.217 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2bb6534-e0, col_values=(('external_ids', {'iface-id': '8bf48ff2-841a-4857-8f05-e5e03ecacf8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:26 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:26Z|00241|binding|INFO|Releasing lport 8bf48ff2-841a-4857-8f05-e5e03ecacf8c from this chassis (sb_readonly=0)
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.219 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2bb6534-edcf-4f06-bd53-7102e28ef382.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2bb6534-edcf-4f06-bd53-7102e28ef382.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.219 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[609ef743-6673-4936-8692-7fb674f10a60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.220 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-d2bb6534-edcf-4f06-bd53-7102e28ef382
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/d2bb6534-edcf-4f06-bd53-7102e28ef382.pid.haproxy
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID d2bb6534-edcf-4f06-bd53-7102e28ef382
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:41:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:26.220 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'env', 'PROCESS_TAG=haproxy-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2bb6534-edcf-4f06-bd53-7102e28ef382.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.383 2 DEBUG nova.compute.manager [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.383 2 DEBUG oslo_concurrency.lockutils [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.383 2 DEBUG oslo_concurrency.lockutils [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.384 2 DEBUG oslo_concurrency.lockutils [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.384 2 DEBUG nova.compute.manager [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:41:26 np0005470441 nova_compute[192626]: 2025-10-04 05:41:26.384 2 WARNING nova.compute.manager [req-4e0eb080-b58d-46d4-9d4f-2c5e0f5ca9f4 req-3f36c102-557f-4077-979c-2a3ff0b4be59 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd for instance with vm_state active and task_state None.#033[00m
Oct  4 01:41:26 np0005470441 podman[227522]: 2025-10-04 05:41:26.547388026 +0000 UTC m=+0.048749447 container create 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:41:26 np0005470441 systemd[1]: Started libpod-conmon-357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1.scope.
Oct  4 01:41:26 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:41:26 np0005470441 podman[227522]: 2025-10-04 05:41:26.521997614 +0000 UTC m=+0.023359045 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:41:26 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/475e75a1610e0398cddae0a1fde46617db6c36c5ce83f95b3767f87bba5497e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:41:26 np0005470441 podman[227522]: 2025-10-04 05:41:26.636459237 +0000 UTC m=+0.137820668 container init 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:41:26 np0005470441 podman[227522]: 2025-10-04 05:41:26.645225316 +0000 UTC m=+0.146586727 container start 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true)
Oct  4 01:41:26 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [NOTICE]   (227541) : New worker (227543) forked
Oct  4 01:41:26 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [NOTICE]   (227541) : Loading success.
Oct  4 01:41:27 np0005470441 nova_compute[192626]: 2025-10-04 05:41:27.376 2 DEBUG nova.network.neutron [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated VIF entry in instance network info cache for port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:41:27 np0005470441 nova_compute[192626]: 2025-10-04 05:41:27.377 2 DEBUG nova.network.neutron [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:41:27 np0005470441 nova_compute[192626]: 2025-10-04 05:41:27.396 2 DEBUG oslo_concurrency.lockutils [req-7b2057ca-79d5-4245-ba43-35098db42d92 req-46011ea8-64d4-4488-bbc4-52e66f8635ec 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:41:27 np0005470441 nova_compute[192626]: 2025-10-04 05:41:27.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.536 2 DEBUG nova.compute.manager [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.536 2 DEBUG oslo_concurrency.lockutils [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.536 2 DEBUG oslo_concurrency.lockutils [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.536 2 DEBUG oslo_concurrency.lockutils [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.537 2 DEBUG nova.compute.manager [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:41:28 np0005470441 nova_compute[192626]: 2025-10-04 05:41:28.537 2 WARNING nova.compute.manager [req-26b63002-039f-46e8-b687-bcccf8a9fdd9 req-b83ec57a-9d72-4eba-9179-23dfb97ac3fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd for instance with vm_state active and task_state None.#033[00m
Oct  4 01:41:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:28Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:03:22 10.100.0.24
Oct  4 01:41:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:28Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:03:22 10.100.0.24
Oct  4 01:41:29 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:29.729 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:30 np0005470441 nova_compute[192626]: 2025-10-04 05:41:30.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.102 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.103 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.150 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.258 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.258 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.268 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.269 2 INFO nova.compute.claims [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.405 2 DEBUG nova.compute.provider_tree [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.420 2 DEBUG nova.scheduler.client.report [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.440 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.441 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.489 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.490 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.517 2 INFO nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.546 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.709 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.712 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.712 2 INFO nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Creating image(s)#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.713 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.714 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.715 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.743 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.766 2 DEBUG nova.policy [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.828 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.829 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.829 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.841 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.907 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.908 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.948 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.949 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:32 np0005470441 nova_compute[192626]: 2025-10-04 05:41:32.950 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.018 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.019 2 DEBUG nova.virt.disk.api [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.019 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.084 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.086 2 DEBUG nova.virt.disk.api [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.087 2 DEBUG nova.objects.instance [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 9656898e-1d93-434d-88db-975744a112d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.117 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.118 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Ensure instance console log exists: /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.119 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.120 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.120 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:33 np0005470441 nova_compute[192626]: 2025-10-04 05:41:33.506 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Successfully created port: 7f499c00-2230-423e-81b3-14ba642bb436 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:41:34 np0005470441 podman[227567]: 2025-10-04 05:41:34.324381305 +0000 UTC m=+0.074533269 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal)
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.347 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Successfully updated port: 7f499c00-2230-423e-81b3-14ba642bb436 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.368 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.368 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.368 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.453 2 DEBUG nova.compute.manager [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-changed-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.454 2 DEBUG nova.compute.manager [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Refreshing instance network info cache due to event network-changed-7f499c00-2230-423e-81b3-14ba642bb436. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.454 2 DEBUG oslo_concurrency.lockutils [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:41:34 np0005470441 nova_compute[192626]: 2025-10-04 05:41:34.531 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:41:35 np0005470441 nova_compute[192626]: 2025-10-04 05:41:35.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.756 2 DEBUG nova.network.neutron [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Updating instance_info_cache with network_info: [{"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.784 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.785 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Instance network_info: |[{"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.785 2 DEBUG oslo_concurrency.lockutils [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.786 2 DEBUG nova.network.neutron [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Refreshing network info cache for port 7f499c00-2230-423e-81b3-14ba642bb436 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.789 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Start _get_guest_xml network_info=[{"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.794 2 WARNING nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.800 2 DEBUG nova.virt.libvirt.host [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.800 2 DEBUG nova.virt.libvirt.host [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.804 2 DEBUG nova.virt.libvirt.host [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.805 2 DEBUG nova.virt.libvirt.host [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.807 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.807 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.808 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.808 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.808 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.809 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.809 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.809 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.809 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.810 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.810 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.810 2 DEBUG nova.virt.hardware [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.815 2 DEBUG nova.virt.libvirt.vif [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-373575841',display_name='tempest-TestNetworkBasicOps-server-373575841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-373575841',id=33,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMV4i3QrHLAODynicG5S7qXswKZBD7kSDxtdfVgro6wiuyOowbZQhzzd+gfAwKPyX+f6hoJ8rK8zgFtCqqftJoCldF3OXHWXCtpI3QNnw/GgFwuWJpq4kkVwJAG+UEZpYw==',key_name='tempest-TestNetworkBasicOps-1996767979',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-2xbvrdyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:41:32Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=9656898e-1d93-434d-88db-975744a112d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.815 2 DEBUG nova.network.os_vif_util [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.816 2 DEBUG nova.network.os_vif_util [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.818 2 DEBUG nova.objects.instance [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9656898e-1d93-434d-88db-975744a112d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.834 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <uuid>9656898e-1d93-434d-88db-975744a112d3</uuid>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <name>instance-00000021</name>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-373575841</nova:name>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:41:37</nova:creationTime>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        <nova:port uuid="7f499c00-2230-423e-81b3-14ba642bb436">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="serial">9656898e-1d93-434d-88db-975744a112d3</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="uuid">9656898e-1d93-434d-88db-975744a112d3</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.config"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:05:6e:95"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <target dev="tap7f499c00-22"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/console.log" append="off"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:41:37 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:41:37 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:41:37 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:41:37 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.835 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Preparing to wait for external event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.835 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.836 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.837 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.837 2 DEBUG nova.virt.libvirt.vif [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-373575841',display_name='tempest-TestNetworkBasicOps-server-373575841',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-373575841',id=33,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMV4i3QrHLAODynicG5S7qXswKZBD7kSDxtdfVgro6wiuyOowbZQhzzd+gfAwKPyX+f6hoJ8rK8zgFtCqqftJoCldF3OXHWXCtpI3QNnw/GgFwuWJpq4kkVwJAG+UEZpYw==',key_name='tempest-TestNetworkBasicOps-1996767979',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-2xbvrdyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:41:32Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=9656898e-1d93-434d-88db-975744a112d3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.837 2 DEBUG nova.network.os_vif_util [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.838 2 DEBUG nova.network.os_vif_util [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.838 2 DEBUG os_vif [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.839 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7f499c00-22, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.842 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7f499c00-22, col_values=(('external_ids', {'iface-id': '7f499c00-2230-423e-81b3-14ba642bb436', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:05:6e:95', 'vm-uuid': '9656898e-1d93-434d-88db-975744a112d3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 NetworkManager[51690]: <info>  [1759556497.8449] manager: (tap7f499c00-22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.854 2 INFO os_vif [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22')#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.915 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.915 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.915 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:05:6e:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:41:37 np0005470441 nova_compute[192626]: 2025-10-04 05:41:37.916 2 INFO nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Using config drive#033[00m
Oct  4 01:41:38 np0005470441 podman[227590]: 2025-10-04 05:41:38.327532118 +0000 UTC m=+0.065407000 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:41:38 np0005470441 nova_compute[192626]: 2025-10-04 05:41:38.969 2 INFO nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Creating config drive at /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.config#033[00m
Oct  4 01:41:38 np0005470441 nova_compute[192626]: 2025-10-04 05:41:38.976 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwcwqlyse execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.104 2 DEBUG oslo_concurrency.processutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwcwqlyse" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:41:39 np0005470441 kernel: tap7f499c00-22: entered promiscuous mode
Oct  4 01:41:39 np0005470441 NetworkManager[51690]: <info>  [1759556499.1745] manager: (tap7f499c00-22): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Oct  4 01:41:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:39Z|00242|binding|INFO|Claiming lport 7f499c00-2230-423e-81b3-14ba642bb436 for this chassis.
Oct  4 01:41:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:39Z|00243|binding|INFO|7f499c00-2230-423e-81b3-14ba642bb436: Claiming fa:16:3e:05:6e:95 10.100.0.18
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.223 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:6e:95 10.100.0.18'], port_security=['fa:16:3e:05:6e:95 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '9656898e-1d93-434d-88db-975744a112d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0d6ebe47-8fd5-4279-902c-577fcfa960dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=841d06f9-5fd2-480b-a960-98e4de1665c7, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7f499c00-2230-423e-81b3-14ba642bb436) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.225 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7f499c00-2230-423e-81b3-14ba642bb436 in datapath d2bb6534-edcf-4f06-bd53-7102e28ef382 bound to our chassis#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.227 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2bb6534-edcf-4f06-bd53-7102e28ef382#033[00m
Oct  4 01:41:39 np0005470441 systemd-machined[152624]: New machine qemu-18-instance-00000021.
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.245 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0075e924-c0d8-4b48-b1eb-df07dec2ff5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:41:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:39Z|00244|binding|INFO|Setting lport 7f499c00-2230-423e-81b3-14ba642bb436 ovn-installed in OVS
Oct  4 01:41:39 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:39Z|00245|binding|INFO|Setting lport 7f499c00-2230-423e-81b3-14ba642bb436 up in Southbound
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:39 np0005470441 systemd[1]: Started Virtual Machine qemu-18-instance-00000021.
Oct  4 01:41:39 np0005470441 systemd-udevd[227635]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:41:39 np0005470441 NetworkManager[51690]: <info>  [1759556499.2816] device (tap7f499c00-22): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:41:39 np0005470441 NetworkManager[51690]: <info>  [1759556499.2825] device (tap7f499c00-22): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.284 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[87e70d0f-5336-4cb5-9880-9d378f50a045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.287 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[3e57102b-ae74-484c-a785-9e65a1cfe2ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.314 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[503322ff-8e4d-4017-9c78-3fe35aa7217d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.331 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ecd833eb-c622-4c2a-a7a1-7e0298ca6442]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2bb6534-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:78:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438371, 'reachable_time': 39720, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227646, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.347 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[32702e86-3ce9-45dc-8f1d-c70b00d97ab2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapd2bb6534-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438383, 'tstamp': 438383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227647, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2bb6534-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438385, 'tstamp': 438385}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227647, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.349 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2bb6534-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.352 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2bb6534-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.352 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.352 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2bb6534-e0, col_values=(('external_ids', {'iface-id': '8bf48ff2-841a-4857-8f05-e5e03ecacf8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:41:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:41:39.353 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.961 2 DEBUG nova.compute.manager [req-611eae70-d219-46b8-ad15-a3cb97856679 req-e2578bfc-da99-4ac0-80b9-c841e68eaf18 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.961 2 DEBUG oslo_concurrency.lockutils [req-611eae70-d219-46b8-ad15-a3cb97856679 req-e2578bfc-da99-4ac0-80b9-c841e68eaf18 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.962 2 DEBUG oslo_concurrency.lockutils [req-611eae70-d219-46b8-ad15-a3cb97856679 req-e2578bfc-da99-4ac0-80b9-c841e68eaf18 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.962 2 DEBUG oslo_concurrency.lockutils [req-611eae70-d219-46b8-ad15-a3cb97856679 req-e2578bfc-da99-4ac0-80b9-c841e68eaf18 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:41:39 np0005470441 nova_compute[192626]: 2025-10-04 05:41:39.962 2 DEBUG nova.compute.manager [req-611eae70-d219-46b8-ad15-a3cb97856679 req-e2578bfc-da99-4ac0-80b9-c841e68eaf18 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Processing event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.145 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556500.1444788, 9656898e-1d93-434d-88db-975744a112d3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.145 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] VM Started (Lifecycle Event)
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.148 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.152 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.156 2 INFO nova.virt.libvirt.driver [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] Instance spawned successfully.
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.157 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.167 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.171 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.183 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.184 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.185 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.185 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.186 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.187 2 DEBUG nova.virt.libvirt.driver [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.193 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.194 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556500.1447132, 9656898e-1d93-434d-88db-975744a112d3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.194 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] VM Paused (Lifecycle Event)
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.233 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.247 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556500.1509166, 9656898e-1d93-434d-88db-975744a112d3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.248 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] VM Resumed (Lifecycle Event)
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.275 2 INFO nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Took 7.56 seconds to spawn the instance on the hypervisor.
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.275 2 DEBUG nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.277 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.284 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.335 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  4 01:41:40 np0005470441 podman[227656]: 2025-10-04 05:41:40.354294476 +0000 UTC m=+0.092076188 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.368 2 INFO nova.compute.manager [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Took 8.14 seconds to build instance.
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.394 2 DEBUG oslo_concurrency.lockutils [None req-16cfe61a-85a0-4aa9-8c16-ff688c51b414 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.428 2 DEBUG nova.network.neutron [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Updated VIF entry in instance network info cache for port 7f499c00-2230-423e-81b3-14ba642bb436. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.429 2 DEBUG nova.network.neutron [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Updating instance_info_cache with network_info: [{"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:41:40 np0005470441 nova_compute[192626]: 2025-10-04 05:41:40.451 2 DEBUG oslo_concurrency.lockutils [req-3b193c1f-140c-4395-8def-dcc4c1ae7175 req-b18d7bdf-38c9-414f-8ace-b798d50d2f1f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-9656898e-1d93-434d-88db-975744a112d3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.062 2 DEBUG nova.compute.manager [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.063 2 DEBUG oslo_concurrency.lockutils [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.063 2 DEBUG oslo_concurrency.lockutils [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.063 2 DEBUG oslo_concurrency.lockutils [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.063 2 DEBUG nova.compute.manager [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] No waiting events found dispatching network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.063 2 WARNING nova.compute.manager [req-cea4a1e8-1082-4295-9e4d-4871bfe9adf3 req-649106c2-4b91-4968-9cea-8cf4f41ec84b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received unexpected event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 for instance with vm_state active and task_state None.
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:42 np0005470441 nova_compute[192626]: 2025-10-04 05:41:42.861 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:43 np0005470441 podman[227675]: 2025-10-04 05:41:43.348048729 +0000 UTC m=+0.098619564 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:41:47 np0005470441 nova_compute[192626]: 2025-10-04 05:41:47.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:47 np0005470441 nova_compute[192626]: 2025-10-04 05:41:47.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:41:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:51Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:05:6e:95 10.100.0.18
Oct  4 01:41:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:41:51Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:05:6e:95 10.100.0.18
Oct  4 01:41:52 np0005470441 podman[227713]: 2025-10-04 05:41:52.30971723 +0000 UTC m=+0.062353553 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=multipathd)
Oct  4 01:41:52 np0005470441 podman[227714]: 2025-10-04 05:41:52.321406322 +0000 UTC m=+0.071659128 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:41:52 np0005470441 nova_compute[192626]: 2025-10-04 05:41:52.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:52 np0005470441 nova_compute[192626]: 2025-10-04 05:41:52.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:56 np0005470441 podman[227755]: 2025-10-04 05:41:56.31624377 +0000 UTC m=+0.071874294 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:41:56 np0005470441 podman[227756]: 2025-10-04 05:41:56.317094854 +0000 UTC m=+0.067936222 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  4 01:41:57 np0005470441 nova_compute[192626]: 2025-10-04 05:41:57.723 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:57 np0005470441 nova_compute[192626]: 2025-10-04 05:41:57.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:41:59 np0005470441 nova_compute[192626]: 2025-10-04 05:41:59.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:01 np0005470441 nova_compute[192626]: 2025-10-04 05:42:01.231 2 DEBUG nova.compute.manager [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-changed-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:01 np0005470441 nova_compute[192626]: 2025-10-04 05:42:01.232 2 DEBUG nova.compute.manager [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing instance network info cache due to event network-changed-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:42:01 np0005470441 nova_compute[192626]: 2025-10-04 05:42:01.233 2 DEBUG oslo_concurrency.lockutils [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:42:01 np0005470441 nova_compute[192626]: 2025-10-04 05:42:01.233 2 DEBUG oslo_concurrency.lockutils [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:42:01 np0005470441 nova_compute[192626]: 2025-10-04 05:42:01.233 2 DEBUG nova.network.neutron [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing network info cache for port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.713 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'name': 'tempest-TestNetworkBasicOps-server-37334901', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001f', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ec39d6d697445438e79b0bfc666a027', 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'hostId': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:42:02 np0005470441 nova_compute[192626]: 2025-10-04 05:42:02.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:02 np0005470441 nova_compute[192626]: 2025-10-04 05:42:02.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:02 np0005470441 nova_compute[192626]: 2025-10-04 05:42:02.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.716 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9656898e-1d93-434d-88db-975744a112d3', 'name': 'tempest-TestNetworkBasicOps-server-373575841', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000021', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7ec39d6d697445438e79b0bfc666a027', 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'hostId': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.717 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:42:02 np0005470441 nova_compute[192626]: 2025-10-04 05:42:02.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.750 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.requests volume: 1118 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.751 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.778 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.requests volume: 1066 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.779 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46804d31-03d3-4f13-bb7b-7d9fa7b86d12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1118, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.717487', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da8347c4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': '1bbde4db224b29999f115bff0127a7cced778b27c4a0884cef0023acc91e5ab0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.717487', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da835ef8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'dd023503e8266c24673facc5db32bce801748a3ddb6035a19b148377646927e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1066, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.717487', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da879d56-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'eee0387de03569922a1588a55ecbfb0b2cfcd2f7528bbdea2c6bb237356c77c0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.717487', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da87b336-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '5e6df206a6e8637fbae69fd98f5b5033a426db26364e5f7860cb7523a3f0fc3a'}]}, 'timestamp': '2025-10-04 05:42:02.780162', '_unique_id': 'bca36bbd449e40eca202147a6ca3908b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.782 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.783 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.787 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dfa10e04-6283-4c0a-94b0-6b4841e55401 / tapc638bfcb-e1 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.788 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for dfa10e04-6283-4c0a-94b0-6b4841e55401 / tap97c6d65d-ad inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.788 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.789 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.793 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9656898e-1d93-434d-88db-975744a112d3 / tap7f499c00-22 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.793 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '968b165f-8c16-49f8-8475-b93a580d203b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.783782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da89232e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '3335954e85560e7577639a0a4d3427446852bd2f4250741c632d446cc58107c5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.783782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da893b3e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '2e2e8a46020ace80467b593866997683be13294d758502ee7da34b8ee9147814'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.783782', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da89e2d2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '590656e4f8b761c5c629752b00c62aa83a2b7a430e45728adf5faf133f6ce20a'}]}, 'timestamp': '2025-10-04 05:42:02.794562', '_unique_id': 'cac79a2471884d77a98eb0b89600baf2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.795 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.797 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.814 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.814 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.829 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.830 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03c39c8a-9de7-4ced-8fd2-78ffee6e1070', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.797909', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da8cf6fc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': '1d2d31436801dd072a6868a35caeb6837df5289a232ce3d8967309eeea556eeb'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.797909', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da8d0ef8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': '6b51cd38f34663cd1923cfd2358e88c651591f75e45634a0b3338d74cde0382a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.797909', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 
'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da8f6040-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': 'e3106ff3557d669e2287d67c5a466e1887dec11120ff210a5ce20bd4086f2bf7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.797909', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da8f74c2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': '0b0c61b7311dc5f8f05053c40245ccdd010b40051de1a1b162a4537474800975'}]}, 'timestamp': '2025-10-04 05:42:02.830992', '_unique_id': '177c2a2546fa4dbaa338e4ade662f449'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.832 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.833 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.834 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.834 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>]
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.834 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.834 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.bytes volume: 30837248 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.835 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.835 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.bytes volume: 29641216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.836 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e30b110f-9570-40b2-b3b5-41c73c592675', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30837248, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.834884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da901fee-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': '5bd677c934fda96f2ba2f8b6e5f92c262a1038bb8281e95cb9091b6be6ba3db7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.834884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9035ce-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'fd13b999f2a49157b4e0d35c8b4ab6b8df7a13ee40b3c8fee59e13c03293ca1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29641216, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.834884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da90469a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '1eaf62d3c532045e096c3a7c810be38791883c4b932575bbaea8692cf56d40e3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.834884', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9057d4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'c112bbdab80d9e65cc3a4487ba1593ce86787e62b7ed713905a7467db936acfb'}]}, 'timestamp': '2025-10-04 05:42:02.836785', '_unique_id': '13766f380d85424095bf012fae87d767'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.837 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.839 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.839 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.840 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4734523e-ad35-4e61-b652-c7d40bdf0e4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.839296', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da90cd9a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'b1b4c655fb3a0eef59caba21e73a20e9048147a1f083618cf9de66d0037ae366'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.839296', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da90dfce-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'e3d39a347f1f887fa5f3eff33bf34b5851bd518430846b215d79ef820a20665f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.839296', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da90f0f4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '09ac5f5bfb70e91aa089ce86e313db1c28c3b2e50075950746d685960b29117d'}]}, 'timestamp': '2025-10-04 05:42:02.840755', '_unique_id': '7051a8ccd7bf4743a9348b0888f23c02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.841 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.843 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.843 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.latency volume: 2950247045 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.844 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.844 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.latency volume: 2771593584 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.845 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8206a66c-b9d6-4b17-b87a-0501f44ea48a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2950247045, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.843385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da917236-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': '939aada79fc9ea03c07cddc895fbfe366fc10e74b68e3a2aab732961eeed69b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.843385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da918a78-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': '4e2c042559602fae7cd8709ec443bbe801e4aa11c6290624a2b37a78803ad9bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2771593584, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.843385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da91a058-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '7c4f5efce8b85b703bce2c7da4df5c307562739293f57ab4e19f6e0508769d59'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.843385', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da91b106-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '3292f36ce2fdb19028bfe966d48d7ca5c80557a48282d2ac1218b2e5e7b1ac9e'}]}, 'timestamp': '2025-10-04 05:42:02.845661', '_unique_id': '32deaa77d01240b688ef801700015199'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.846 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.848 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.848 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.849 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d7590fa-ce79-42ab-aec6-e2d8114e3c16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.848263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da922bb8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '0ad083449002b0d3591f869ffe37dd8ef622aef6b5dc80549d463f240dd3a153'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.848263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da923dd8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '7db3dc6aa8b6f29aff01f02e21a52944bba44b46432157df1e5b3f9d23987d4e'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.848263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da924ed6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '6c67ac1bec622ee71718335144b95b6f227dcbc42ad22b67c3019de2f51f815f'}]}, 'timestamp': '2025-10-04 05:42:02.849763', '_unique_id': '9c60ee2ee2054f38baa48111975c670c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.852 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.852 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.latency volume: 541308284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.852 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.read.latency volume: 83637400 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.853 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.latency volume: 483224935 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.853 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.read.latency volume: 66401348 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '125afbf0-f092-44e0-a168-69d4fa8aca31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 541308284, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.852198', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da92c3a2-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'f8b14f83ded7aee23818ea5b2bbabe7baec4235bd937edcef3db7476e3254793'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 83637400, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.852198', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da92d7de-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'da5f9b2d066386693c0ce93b46f6376b59b2ae92e1bc942f13c19d223e24829d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 483224935, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.852198', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da92e832-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '8554541f09b4324369e1e928bc191504ca1578b833211315412fd1f2049136a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 66401348, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.852198', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da92fcfa-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'bd149caf7715cf676c6d757b8db83d1304a20ebcf0d9168058490aa8c3285fb5'}]}, 'timestamp': '2025-10-04 05:42:02.854120', '_unique_id': '4da14b51a20f46f287cf3e575aaba7ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.855 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.856 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.856 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.856 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>]
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.857 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.857 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.858 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.858 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06597de2-641e-469d-a425-d52833a10ee9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.857456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da939ba6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'df9b1dc6b9b0da7ba38e10b6f30adbb209933a087e46f5a7e93f7e61b627745f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.857456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da93afc4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '5747bbab1e72ac4f25fdd050bfad59a8860a2cbe0a5598544118943449879b47'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.857456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da93c568-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': 'ce57f2bc5a36afca332e0758ebedb988b00b5cde786ae83e9aadf243a42c5162'}]}, 'timestamp': '2025-10-04 05:42:02.859346', '_unique_id': 'f398e920456c46cba82ff1d2e8600389'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.860 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.862 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.881 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/memory.usage volume: 43.78125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.900 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/memory.usage volume: 42.59375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 nova_compute[192626]: 2025-10-04 05:42:02.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '05a57d41-6d12-4377-942e-3593712bf6cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.78125, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'timestamp': '2025-10-04T05:42:02.862668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da973d88-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.595897924, 'message_signature': '8df1ed3b682bf7e6c082535a80de745ce1a97f0db71597d0d89e6a773d41fa0e'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.59375, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3', 'timestamp': '2025-10-04T05:42:02.862668', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'da9a2ebc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.61510927, 'message_signature': '3fe24336f62764ed11050796a8acb638766eea9218daa2bea7a4f8a14efcbaf7'}]}, 'timestamp': '2025-10-04 05:42:02.901251', '_unique_id': '9a086d67c11f4e75b4d0e2d9ca3f115d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.902 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.903 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.903 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/cpu volume: 11920000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.904 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/cpu volume: 11050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92748ce-9027-48cf-a51f-7b2ef6c96da7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11920000000, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'timestamp': '2025-10-04T05:42:02.903777', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da9a9f28-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.595897924, 'message_signature': '6f4e28761ba0cc7fe29e64ffbb28f43d07cbb46e80bd1422ff924a5e11f5106b'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11050000000, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3', 'timestamp': '2025-10-04T05:42:02.903777', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'da9aa9c8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.61510927, 'message_signature': '355555e02c64c7910550bd7a68ea67cb750e4c7c7fb3626abd6e1bdd494875d1'}]}, 'timestamp': '2025-10-04 05:42:02.904360', '_unique_id': '64a4c5ae330745cdb71e7bdb988ee741'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.905 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.906 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.requests volume: 331 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.906 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.906 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.906 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '878b520c-ee95-4e9b-b0ce-06071475aa83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 331, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.906022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9af84c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': '97652a2218a7209bdac8cf38f53a1c5da8bf7cc39f2c82ac98838864bf47522f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': 
None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.906022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9b0512-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'eb300e4a81bf139a07d025071142067073e0f8b7051e97a485c70df60ab76ff8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.906022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9b10ca-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'a6b10ea702fa9b46490edfff076a6c0d9348cbd0e169f20216dc654cfec97c6a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.906022', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9b1b4c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': '7dec5054acfc4e860844ccc29f2f89eb14968571ed643e3b76c2cb0c007ce0dc'}]}, 'timestamp': '2025-10-04 05:42:02.907249', '_unique_id': 'a0633b16894c43d3ae63c2c7e2a805f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.907 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.908 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.908 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>]
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.909 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.909 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.bytes volume: 73125888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.909 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.909 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.910 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a5b981b-99d3-418c-9753-39247f228741', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73125888, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.909239', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9b747a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'c7fa480ff91e9421dae6d40fd32f6257dbc422fbe12bd238b31f66c2b854605f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 
'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.909239', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9b82f8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.432102199, 'message_signature': 'c07c7b0caca89cc8c787870f38b39a8382f067c6d603082ce0e47ab4c1f88bfa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.909239', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9b8dac-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'd7ce12275186c4e60f30ebf0573b4fc92b9ab6fc689f3109bfc4a0147e100c34'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.909239', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9b996e-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.466346542, 'message_signature': 'de1b33753267cfcda47756479e4d7c0c005202583e972b42bb63a95f6adf04fa'}]}, 'timestamp': '2025-10-04 05:42:02.910492', '_unique_id': '0c73a8beedb34cb5aeb4ba39196b55aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.911 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.912 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.912 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.913 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9195a3a6-04d0-48b9-99a0-c9fb84a89659', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.912307', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9becca-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '885e4c67e9cfddd629a325fdf46cb1bee839efac37ea68bfff9d5d4f5cbe18d5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.912307', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9bfa6c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '6cb1c3ced2f0e52dca7afa029368358e1dfbf7a4305db519510c7773ea308681'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.912307', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9c0dcc-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '556777033f6a375bd527951e2ccca8468e1c9ef5159c733465fe1657fbee1eb4'}]}, 'timestamp': '2025-10-04 05:42:02.913480', '_unique_id': '6e770d0b6e7f43038173f30ed80c8bc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.914 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.915 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.915 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.bytes volume: 61718 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.915 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.bytes volume: 3186 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.916 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.outgoing.bytes volume: 1984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0efd61c-47be-40ea-8ba1-28cb70c7c6a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 61718, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.915500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9c6ac4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'a2d175556fc8e0ca7e8684578212c47a361ae1137989ef40abb4584b6c1150a2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3186, 'user_id': 
'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.915500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9c7794-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'ccc36547de955070181894788847469aef07aee3d03529ae8a2f5e18f62483d2'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1984, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.915500', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9c8504-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': 'e1d359bdefcdea008ddfa74c939fa90d989b379bdc969ca004f8dc5580586660'}]}, 'timestamp': '2025-10-04 05:42:02.916576', '_unique_id': 'b93b866923f4417e9be9b4f97468b7a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.917 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.918 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.918 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-37334901>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-373575841>]
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.918 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.918 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.919 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.919 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.919 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cd2bd14-94fe-4b44-86d9-b3cddc3f371b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.918713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9ce648-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': '317a4575b0f42e951837e19bcb52b0fecab5f3c18a85ebc8e0966305cf6b0060'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.918713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9cf200-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': '2f6758ba7975ce3194b8920dba2f1309254ba5e54108ada0fc7ad72c1475119b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.918713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9cfce6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': '2fac924142a0bb7572a9a5ec2cf8f1419c6ef7512816e216c8429a0a7858a630'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.918713', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9d09e8-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': '3ffb3d16408d82d8511230e2c25c477dc89e8b6b0402dc47b45d2c21f08a1484'}]}, 'timestamp': '2025-10-04 05:42:02.919911', '_unique_id': '919775a6e5fa44c8b118d97533dad3e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.920 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.921 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets volume: 371 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.922 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets volume: 21 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.922 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.incoming.packets volume: 24 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f25c57ec-ae33-4c91-8a25-47c5e952bdfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 371, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.921656', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9d5b0a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '7222b0d9dad852d5cb5ece03da5b1c1bb07d3b88b99d73be16506a95167ae12f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 21, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.921656', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9d6640-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '97abfbe32c8990519ae73c216a058d8c5088868be0d5f13df0a176e42c987f9b'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 24, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.921656', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9d7284-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '3a9554d9b18854fec386b0a8508c89b2cc7981fadff9a7a04b2525367bda8c0e'}]}, 'timestamp': '2025-10-04 05:42:02.922687', '_unique_id': '3c6a361d7176412d91eb40da4c92e0d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.923 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.924 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.924 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.bytes volume: 71807 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.925 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.bytes volume: 2274 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.925 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.incoming.bytes volume: 2744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '668158b8-cffd-4adb-b9c3-36de256dfa28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 71807, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.924706', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9dd814-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '98674efccd4f24315eae4f30aa9cc399afc773a06056a81e7d353541670d0197'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2274, 'user_id': 
'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.924706', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9de6ec-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'ef94e8c5c4d36c25aaad1a5db47611abaad7175df54b8dea3c16653194092d42'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2744, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.924706', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 
'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9df3ee-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '534cc0b376d105eb7f80892c108a929c1bf9c9598f01283b6230f0109e1a0b80'}]}, 'timestamp': '2025-10-04 05:42:02.925912', '_unique_id': 'c5a21552fcc848838fd935f6b8d3c8e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.926 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.927 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.927 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets volume: 402 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.927 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.outgoing.packets volume: 33 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.928 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.outgoing.packets volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3f445caf-4810-4026-a0a1-26b60371015b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 402, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.927541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9e3fb6-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'd68f8bf1a897b67a2f293d8c6f95c4fde1333ca817b4e30a30eea300240749f6'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 33, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.927541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9e4bb4-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'f169478a1515434cfa24027b35d456953c3b77cedb03b2ca24fd45a0adc3f8b7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 20, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.927541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9e5794-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '9dbae17810a304f38f094e10ea62b028507ddfbe05a046583365b015dd5d35a1'}]}, 'timestamp': '2025-10-04 05:42:02.928507', '_unique_id': '754bdcb6fb2a4205a59db9ab75880fd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.929 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.930 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.931 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.931 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.931 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28078552-69b3-4ebd-9f6b-21d7f76244d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tapc638bfcb-e1', 'timestamp': '2025-10-04T05:42:02.931035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tapc638bfcb-e1', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:37:07:15', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc638bfcb-e1'}, 'message_id': 'da9ec83c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': '323024a5eb6a5365c3477bfc18f0d637f9f35952e82ce4f5812f7ef2a23b0eab'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-0000001f-dfa10e04-6283-4c0a-94b0-6b4841e55401-tap97c6d65d-ad', 'timestamp': '2025-10-04T05:42:02.931035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'tap97c6d65d-ad', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0d:03:22', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap97c6d65d-ad'}, 'message_id': 'da9ed408-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.498396263, 'message_signature': 'c8d78a00a426eaa73cac010ea5e011865ca238d0094caf12a0d49bc60e96c39a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'instance-00000021-9656898e-1d93-434d-88db-975744a112d3-tap7f499c00-22', 'timestamp': '2025-10-04T05:42:02.931035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'tap7f499c00-22', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 
'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:05:6e:95', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7f499c00-22'}, 'message_id': 'da9ee042-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.504741503, 'message_signature': '619d46f7bbac39fac783e54b2ce0dc5669a1816651aaf331f846f29c2c5d32e0'}]}, 'timestamp': '2025-10-04 05:42:02.931980', '_unique_id': '8a8ba6b7fe6842e5bb0c06ee4394d526'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.932 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.933 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.933 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.934 12 DEBUG ceilometer.compute.pollsters [-] dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.934 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.935 12 DEBUG ceilometer.compute.pollsters [-] 9656898e-1d93-434d-88db-975744a112d3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eea9916f-b97d-4955-a98d-159bac2bea11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401-vda', 'timestamp': '2025-10-04T05:42:02.933800', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9f3452-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': 'fb47cac59fbdb02a716f39adc4356c22615217ede867711b06e1b81c03dbd885'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': 
'dfa10e04-6283-4c0a-94b0-6b4841e55401-sda', 'timestamp': '2025-10-04T05:42:02.933800', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-37334901', 'name': 'instance-0000001f', 'instance_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9f428a-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.512498084, 'message_signature': '06b9802e9f4ef69aa7f33234e29931a8e843b8b1334cc40fb4ab55057e319d31'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-vda', 'timestamp': '2025-10-04T05:42:02.933800', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'da9f5644-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': '904d2cfe8673f36156be300ff1e2a8f468c9841dcf3e313b444437894a1fdfd4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_name': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_name': None, 'resource_id': '9656898e-1d93-434d-88db-975744a112d3-sda', 'timestamp': '2025-10-04T05:42:02.933800', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-373575841', 'name': 'instance-00000021', 'instance_id': '9656898e-1d93-434d-88db-975744a112d3', 'instance_type': 'm1.nano', 'host': 'e9e6ce69f8dca552ce038c266c49006d4c938d11869cfc6fe1a7f3bd', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'da9f665c-a0e4-11f0-8814-fa163ed2379c', 'monotonic_time': 4420.529807156, 'message_signature': '8ea31e305f5484d7f9158e812d9f12b9aec49417f44499c804040ff5fa3e7fb1'}]}, 'timestamp': '2025-10-04 05:42:02.935451', '_unique_id': '327dd3fd48384bb1907c5699659ae4cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:42:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:42:02.936 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.476 2 DEBUG nova.network.neutron [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated VIF entry in instance network info cache for port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.477 2 DEBUG nova.network.neutron [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.506 2 DEBUG oslo_concurrency.lockutils [req-9276a65b-e01b-451c-8d75-5205f3519402 req-42f401cb-4220-4a05-9fbd-c5cdbbcf8d2b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.746 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.835 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:42:04 np0005470441 podman[227796]: 2025-10-04 05:42:04.882923665 +0000 UTC m=+0.081332043 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.908 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.909 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.966 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:42:04 np0005470441 nova_compute[192626]: 2025-10-04 05:42:04.974 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.033 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.034 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.086 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.258 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.260 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5426MB free_disk=73.37055587768555GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.261 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.261 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.336 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance dfa10e04-6283-4c0a-94b0-6b4841e55401 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.336 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 9656898e-1d93-434d-88db-975744a112d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.336 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.337 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.356 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.371 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.371 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.391 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.433 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.497 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.521 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.596 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:42:05 np0005470441 nova_compute[192626]: 2025-10-04 05:42:05.597 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:06 np0005470441 nova_compute[192626]: 2025-10-04 05:42:06.597 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:06 np0005470441 nova_compute[192626]: 2025-10-04 05:42:06.597 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:42:06 np0005470441 nova_compute[192626]: 2025-10-04 05:42:06.597 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:42:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:06.749 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:06.750 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:06.751 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.773 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.773 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.774 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.774 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:07 np0005470441 nova_compute[192626]: 2025-10-04 05:42:07.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:09 np0005470441 podman[227829]: 2025-10-04 05:42:09.337215871 +0000 UTC m=+0.078430440 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:42:11 np0005470441 podman[227856]: 2025-10-04 05:42:11.310246411 +0000 UTC m=+0.063834595 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.928 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.958 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.958 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.959 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.959 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:11 np0005470441 nova_compute[192626]: 2025-10-04 05:42:11.959 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:12 np0005470441 nova_compute[192626]: 2025-10-04 05:42:12.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:12 np0005470441 nova_compute[192626]: 2025-10-04 05:42:12.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:14 np0005470441 podman[227877]: 2025-10-04 05:42:14.333523563 +0000 UTC m=+0.084762350 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller)
Oct  4 01:42:14 np0005470441 nova_compute[192626]: 2025-10-04 05:42:14.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:42:17 np0005470441 nova_compute[192626]: 2025-10-04 05:42:17.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:17 np0005470441 nova_compute[192626]: 2025-10-04 05:42:17.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:21Z|00246|binding|INFO|Releasing lport 8bf48ff2-841a-4857-8f05-e5e03ecacf8c from this chassis (sb_readonly=0)
Oct  4 01:42:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:21Z|00247|binding|INFO|Releasing lport 224f2340-bf1e-48df-8648-a854ae221536 from this chassis (sb_readonly=0)
Oct  4 01:42:21 np0005470441 nova_compute[192626]: 2025-10-04 05:42:21.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:22 np0005470441 nova_compute[192626]: 2025-10-04 05:42:22.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:22 np0005470441 nova_compute[192626]: 2025-10-04 05:42:22.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:22.807 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:42:22 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:22.809 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:42:22 np0005470441 nova_compute[192626]: 2025-10-04 05:42:22.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:23 np0005470441 podman[227903]: 2025-10-04 05:42:23.311080797 +0000 UTC m=+0.054538401 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:42:23 np0005470441 podman[227904]: 2025-10-04 05:42:23.329588263 +0000 UTC m=+0.068835007 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:42:24 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:24.812 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:27 np0005470441 podman[227951]: 2025-10-04 05:42:27.344069209 +0000 UTC m=+0.076836694 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:42:27 np0005470441 podman[227950]: 2025-10-04 05:42:27.357161561 +0000 UTC m=+0.099071926 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:42:27 np0005470441 nova_compute[192626]: 2025-10-04 05:42:27.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:27 np0005470441 nova_compute[192626]: 2025-10-04 05:42:27.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:31 np0005470441 nova_compute[192626]: 2025-10-04 05:42:31.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.114 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.114 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.115 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.115 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.115 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.116 2 INFO nova.compute.manager [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Terminating instance#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.117 2 DEBUG nova.compute.manager [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:42:32 np0005470441 kernel: tap7f499c00-22 (unregistering): left promiscuous mode
Oct  4 01:42:32 np0005470441 NetworkManager[51690]: <info>  [1759556552.1486] device (tap7f499c00-22): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:32Z|00248|binding|INFO|Releasing lport 7f499c00-2230-423e-81b3-14ba642bb436 from this chassis (sb_readonly=0)
Oct  4 01:42:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:32Z|00249|binding|INFO|Setting lport 7f499c00-2230-423e-81b3-14ba642bb436 down in Southbound
Oct  4 01:42:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:32Z|00250|binding|INFO|Removing iface tap7f499c00-22 ovn-installed in OVS
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.170 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:05:6e:95 10.100.0.18'], port_security=['fa:16:3e:05:6e:95 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '9656898e-1d93-434d-88db-975744a112d3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0d6ebe47-8fd5-4279-902c-577fcfa960dd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=841d06f9-5fd2-480b-a960-98e4de1665c7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7f499c00-2230-423e-81b3-14ba642bb436) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.172 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7f499c00-2230-423e-81b3-14ba642bb436 in datapath d2bb6534-edcf-4f06-bd53-7102e28ef382 unbound from our chassis#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.174 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2bb6534-edcf-4f06-bd53-7102e28ef382#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.190 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b9771552-ed12-40e5-b625-6ddfc25f798e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.221 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[53955701-bae8-4f0d-8f6b-df38243a7a12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  4 01:42:32 np0005470441 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000021.scope: Consumed 14.324s CPU time.
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.225 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ec126c4f-f849-4df3-983c-3fffe8573e92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 systemd-machined[152624]: Machine qemu-18-instance-00000021 terminated.
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.260 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[6268eb0d-e9e9-4f2d-9754-8eb9bf830003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.283 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ce7d9cd4-44d7-430c-a932-22cc0dec38df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2bb6534-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:78:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 7, 'rx_bytes': 1042, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 7, 'rx_bytes': 1042, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438371, 'reachable_time': 33894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228000, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.298 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c93f326f-fa40-4aa0-889f-e54fb11b1339]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapd2bb6534-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438383, 'tstamp': 438383}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228001, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd2bb6534-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 438385, 'tstamp': 438385}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228001, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.299 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2bb6534-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.307 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2bb6534-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.307 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.307 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2bb6534-e0, col_values=(('external_ids', {'iface-id': '8bf48ff2-841a-4857-8f05-e5e03ecacf8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:32.307 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.379 2 INFO nova.virt.libvirt.driver [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] Instance destroyed successfully.#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.380 2 DEBUG nova.objects.instance [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 9656898e-1d93-434d-88db-975744a112d3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.422 2 DEBUG nova.virt.libvirt.vif [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:41:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-373575841',display_name='tempest-TestNetworkBasicOps-server-373575841',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-373575841',id=33,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMV4i3QrHLAODynicG5S7qXswKZBD7kSDxtdfVgro6wiuyOowbZQhzzd+gfAwKPyX+f6hoJ8rK8zgFtCqqftJoCldF3OXHWXCtpI3QNnw/GgFwuWJpq4kkVwJAG+UEZpYw==',key_name='tempest-TestNetworkBasicOps-1996767979',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:41:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-2xbvrdyf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:41:40Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=9656898e-1d93-434d-88db-975744a112d3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.422 2 DEBUG nova.network.os_vif_util [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "7f499c00-2230-423e-81b3-14ba642bb436", "address": "fa:16:3e:05:6e:95", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7f499c00-22", "ovs_interfaceid": "7f499c00-2230-423e-81b3-14ba642bb436", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.423 2 DEBUG nova.network.os_vif_util [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.423 2 DEBUG os_vif [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.425 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7f499c00-22, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.432 2 INFO os_vif [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:05:6e:95,bridge_name='br-int',has_traffic_filtering=True,id=7f499c00-2230-423e-81b3-14ba642bb436,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7f499c00-22')#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.433 2 INFO nova.virt.libvirt.driver [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Deleting instance files /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3_del#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.434 2 INFO nova.virt.libvirt.driver [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Deletion of /var/lib/nova/instances/9656898e-1d93-434d-88db-975744a112d3_del complete#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.499 2 DEBUG nova.compute.manager [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-unplugged-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.499 2 DEBUG oslo_concurrency.lockutils [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.500 2 DEBUG oslo_concurrency.lockutils [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.500 2 DEBUG oslo_concurrency.lockutils [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.500 2 DEBUG nova.compute.manager [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] No waiting events found dispatching network-vif-unplugged-7f499c00-2230-423e-81b3-14ba642bb436 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.501 2 DEBUG nova.compute.manager [req-772242c7-c2c7-4f7d-94f7-e55a0d9b7fe4 req-923674bc-9bdb-4393-8dfd-3f7a70ac67b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-unplugged-7f499c00-2230-423e-81b3-14ba642bb436 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.507 2 INFO nova.compute.manager [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.508 2 DEBUG oslo.service.loopingcall [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.508 2 DEBUG nova.compute.manager [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.508 2 DEBUG nova.network.neutron [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:42:32 np0005470441 nova_compute[192626]: 2025-10-04 05:42:32.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.347 2 DEBUG nova.network.neutron [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.383 2 INFO nova.compute.manager [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] Took 1.87 seconds to deallocate network for instance.#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.464 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.465 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.509 2 DEBUG nova.compute.manager [req-0ea8e56a-5bca-457b-be2d-a908579a109e req-554cf900-54e6-48cd-99c4-949fb5a574f8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-deleted-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.592 2 DEBUG nova.compute.provider_tree [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.621 2 DEBUG nova.scheduler.client.report [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.628 2 DEBUG nova.compute.manager [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.629 2 DEBUG oslo_concurrency.lockutils [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "9656898e-1d93-434d-88db-975744a112d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.629 2 DEBUG oslo_concurrency.lockutils [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.629 2 DEBUG oslo_concurrency.lockutils [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.629 2 DEBUG nova.compute.manager [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] No waiting events found dispatching network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.630 2 WARNING nova.compute.manager [req-f7c9dcc1-a9a4-40a6-a662-937d57790e7d req-e5d9540e-3c2a-4ff4-9d6f-480fcf59085e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 9656898e-1d93-434d-88db-975744a112d3] Received unexpected event network-vif-plugged-7f499c00-2230-423e-81b3-14ba642bb436 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.648 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.678 2 INFO nova.scheduler.client.report [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 9656898e-1d93-434d-88db-975744a112d3#033[00m
Oct  4 01:42:34 np0005470441 nova_compute[192626]: 2025-10-04 05:42:34.794 2 DEBUG oslo_concurrency.lockutils [None req-2e5a55d3-b14b-4b5e-b2ca-e1cf3fc5783d b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "9656898e-1d93-434d-88db-975744a112d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:35 np0005470441 podman[228019]: 2025-10-04 05:42:35.34341032 +0000 UTC m=+0.081229170 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.911 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.912 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.941 2 DEBUG nova.objects.instance [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'flavor' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.968 2 DEBUG nova.virt.libvirt.vif [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.969 2 DEBUG nova.network.os_vif_util [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.969 2 DEBUG nova.network.os_vif_util [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.974 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.977 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.979 2 DEBUG nova.virt.libvirt.driver [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Attempting to detach device tap97c6d65d-ad from instance dfa10e04-6283-4c0a-94b0-6b4841e55401 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.979 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] detach device xml: <interface type="ethernet">
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <mac address="fa:16:3e:0d:03:22"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <model type="virtio"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <mtu size="1442"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <target dev="tap97c6d65d-ad"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]: </interface>
Oct  4 01:42:35 np0005470441 nova_compute[192626]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.989 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  4 01:42:35 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.993 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface>not found in domain: <domain type='kvm' id='17'>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <name>instance-0000001f</name>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <uuid>dfa10e04-6283-4c0a-94b0-6b4841e55401</uuid>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:41:26</nova:creationTime>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <nova:port uuid="97c6d65d-ad9d-4e43-a41c-301d4df8ccdd">
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:35 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <memory unit='KiB'>131072</memory>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <vcpu placement='static'>1</vcpu>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <resource>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <partition>/machine</partition>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </resource>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <sysinfo type='smbios'>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='manufacturer'>RDO</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='product'>OpenStack Compute</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='serial'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='uuid'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:      <entry name='family'>Virtual Machine</entry>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <boot dev='hd'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <smbios mode='sysinfo'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <vmcoreinfo state='on'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <cpu mode='custom' match='exact' check='full'>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <model fallback='forbid'>Nehalem</model>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <feature policy='require' name='x2apic'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <feature policy='require' name='hypervisor'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <feature policy='require' name='vme'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  <clock offset='utc'>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <timer name='pit' tickpolicy='delay'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:    <timer name='hpet' present='no'/>
Oct  4 01:42:35 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_poweroff>destroy</on_poweroff>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_reboot>restart</on_reboot>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_crash>destroy</on_crash>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <disk type='file' device='disk'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk' index='2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backingStore type='file' index='3'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <format type='raw'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <source file='/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <backingStore/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      </backingStore>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='vda' bus='virtio'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='virtio-disk0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <disk type='file' device='cdrom'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='qemu' type='raw' cache='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config' index='1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backingStore/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='sda' bus='sata'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <readonly/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='sata0-0-0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='0' model='pcie-root'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pcie.0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='1' port='0x10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='2' port='0x11'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='3' port='0x12'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='4' port='0x13'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='5' port='0x14'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='6' port='0x15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='7' port='0x16'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='8' port='0x17'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.8'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='9' port='0x18'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.9'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='10' port='0x19'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='11' port='0x1a'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.11'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='12' port='0x1b'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.12'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='13' port='0x1c'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.13'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='14' port='0x1d'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.14'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='15' port='0x1e'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='16' port='0x1f'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.16'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='17' port='0x20'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.17'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='18' port='0x21'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.18'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='19' port='0x22'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.19'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='20' port='0x23'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.20'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='21' port='0x24'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.21'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='22' port='0x25'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.22'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='23' port='0x26'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.23'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='24' port='0x27'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.24'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='25' port='0x28'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.25'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-pci-bridge'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.26'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='usb'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='sata' index='0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='ide'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <interface type='ethernet'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mac address='fa:16:3e:37:07:15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='tapc638bfcb-e1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model type='virtio'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='vhost' rx_queue_size='512'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mtu size='1442'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='net0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <interface type='ethernet'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mac address='fa:16:3e:0d:03:22'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='tap97c6d65d-ad'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model type='virtio'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='vhost' rx_queue_size='512'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mtu size='1442'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='net1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <serial type='pty'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target type='isa-serial' port='0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <model name='isa-serial'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      </target>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <console type='pty' tty='/dev/pts/0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target type='serial' port='0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </console>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='tablet' bus='usb'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='usb' bus='0' port='1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='mouse' bus='ps2'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='keyboard' bus='ps2'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <listen type='address' address='::0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <audio id='1' type='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model type='virtio' heads='1' primary='yes'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='video0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <watchdog model='itco' action='reset'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='watchdog0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </watchdog>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <memballoon model='virtio'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <stats period='10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='balloon0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <rng model='virtio'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backend model='random'>/dev/urandom</backend>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='rng0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <label>system_u:system_r:svirt_t:s0:c684,c998</label>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c684,c998</imagelabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <label>+107:+107</label>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <imagelabel>+107:+107</imagelabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.995 2 INFO nova.virt.libvirt.driver [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully detached device tap97c6d65d-ad from instance dfa10e04-6283-4c0a-94b0-6b4841e55401 from the persistent domain config.
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.995 2 DEBUG nova.virt.libvirt.driver [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] (1/8): Attempting to detach device tap97c6d65d-ad with device alias net1 from instance dfa10e04-6283-4c0a-94b0-6b4841e55401 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:35.996 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] detach device xml: <interface type="ethernet">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <mac address="fa:16:3e:0d:03:22"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <model type="virtio"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <mtu size="1442"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <target dev="tap97c6d65d-ad"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: </interface>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  4 01:42:36 np0005470441 kernel: tap97c6d65d-ad (unregistering): left promiscuous mode
Oct  4 01:42:36 np0005470441 NetworkManager[51690]: <info>  [1759556556.0904] device (tap97c6d65d-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:42:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:36Z|00251|binding|INFO|Releasing lport 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd from this chassis (sb_readonly=0)
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:36Z|00252|binding|INFO|Setting lport 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd down in Southbound
Oct  4 01:42:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:36Z|00253|binding|INFO|Removing iface tap97c6d65d-ad ovn-installed in OVS
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.116 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:03:22 10.100.0.24', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=841d06f9-5fd2-480b-a960-98e4de1665c7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.117 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd in datapath d2bb6534-edcf-4f06-bd53-7102e28ef382 unbound from our chassis
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.118 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2bb6534-edcf-4f06-bd53-7102e28ef382, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.119 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0e8a2016-c4c4-428c-b59f-b05005f2978f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.119 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382 namespace which is not needed anymore
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.119 2 DEBUG nova.virt.libvirt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Received event <DeviceRemovedEvent: 1759556556.1194139, dfa10e04-6283-4c0a-94b0-6b4841e55401 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.121 2 DEBUG nova.virt.libvirt.driver [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Start waiting for the detach event from libvirt for device tap97c6d65d-ad with device alias net1 for instance dfa10e04-6283-4c0a-94b0-6b4841e55401 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.122 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.126 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface>not found in domain: <domain type='kvm' id='17'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <name>instance-0000001f</name>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <uuid>dfa10e04-6283-4c0a-94b0-6b4841e55401</uuid>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:41:26</nova:creationTime>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:port uuid="97c6d65d-ad9d-4e43-a41c-301d4df8ccdd">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <memory unit='KiB'>131072</memory>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <vcpu placement='static'>1</vcpu>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <resource>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <partition>/machine</partition>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </resource>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <sysinfo type='smbios'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='manufacturer'>RDO</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='product'>OpenStack Compute</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='serial'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='uuid'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <entry name='family'>Virtual Machine</entry>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <boot dev='hd'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <smbios mode='sysinfo'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <vmcoreinfo state='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <cpu mode='custom' match='exact' check='full'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <model fallback='forbid'>Nehalem</model>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <feature policy='require' name='x2apic'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <feature policy='require' name='hypervisor'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <feature policy='require' name='vme'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <clock offset='utc'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <timer name='pit' tickpolicy='delay'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <timer name='hpet' present='no'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_poweroff>destroy</on_poweroff>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_reboot>restart</on_reboot>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <on_crash>destroy</on_crash>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <disk type='file' device='disk'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk' index='2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backingStore type='file' index='3'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <format type='raw'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <source file='/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <backingStore/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      </backingStore>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='vda' bus='virtio'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='virtio-disk0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <disk type='file' device='cdrom'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='qemu' type='raw' cache='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config' index='1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backingStore/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='sda' bus='sata'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <readonly/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='sata0-0-0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='0' model='pcie-root'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pcie.0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='1' port='0x10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='2' port='0x11'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='3' port='0x12'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='4' port='0x13'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='5' port='0x14'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='6' port='0x15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='7' port='0x16'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='8' port='0x17'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.8'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='9' port='0x18'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.9'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='10' port='0x19'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='11' port='0x1a'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.11'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='12' port='0x1b'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.12'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='13' port='0x1c'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.13'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='14' port='0x1d'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.14'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='15' port='0x1e'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='16' port='0x1f'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.16'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='17' port='0x20'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.17'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='18' port='0x21'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.18'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='19' port='0x22'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.19'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='20' port='0x23'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.20'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='21' port='0x24'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.21'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='22' port='0x25'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.22'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='23' port='0x26'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.23'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='24' port='0x27'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.24'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target chassis='25' port='0x28'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.25'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model name='pcie-pci-bridge'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='pci.26'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='usb'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <controller type='sata' index='0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='ide'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <interface type='ethernet'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mac address='fa:16:3e:37:07:15'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target dev='tapc638bfcb-e1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model type='virtio'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <driver name='vhost' rx_queue_size='512'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <mtu size='1442'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='net0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <serial type='pty'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target type='isa-serial' port='0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:        <model name='isa-serial'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      </target>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <console type='pty' tty='/dev/pts/0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <target type='serial' port='0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </console>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='tablet' bus='usb'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='usb' bus='0' port='1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='mouse' bus='ps2'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input1'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <input type='keyboard' bus='ps2'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='input2'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <listen type='address' address='::0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <audio id='1' type='none'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <model type='virtio' heads='1' primary='yes'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='video0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <watchdog model='itco' action='reset'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='watchdog0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </watchdog>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <memballoon model='virtio'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <stats period='10'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='balloon0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <rng model='virtio'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <backend model='random'>/dev/urandom</backend>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <alias name='rng0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <label>system_u:system_r:svirt_t:s0:c684,c998</label>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c684,c998</imagelabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <label>+107:+107</label>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <imagelabel>+107:+107</imagelabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.127 2 INFO nova.virt.libvirt.driver [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully detached device tap97c6d65d-ad from instance dfa10e04-6283-4c0a-94b0-6b4841e55401 from the live domain config.
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.128 2 DEBUG nova.virt.libvirt.vif [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.128 2 DEBUG nova.network.os_vif_util [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.129 2 DEBUG nova.network.os_vif_util [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.129 2 DEBUG os_vif [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.131 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97c6d65d-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.135 2 INFO os_vif [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad')
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.135 2 DEBUG nova.virt.libvirt.guest [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:42:36</nova:creationTime>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:36 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:36 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:36 np0005470441 nova_compute[192626]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Oct  4 01:42:36 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [NOTICE]   (227541) : haproxy version is 2.8.14-c23fe91
Oct  4 01:42:36 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [NOTICE]   (227541) : path to executable is /usr/sbin/haproxy
Oct  4 01:42:36 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [WARNING]  (227541) : Exiting Master process...
Oct  4 01:42:36 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [ALERT]    (227541) : Current worker (227543) exited with code 143 (Terminated)
Oct  4 01:42:36 np0005470441 neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382[227537]: [WARNING]  (227541) : All workers exited. Exiting... (0)
Oct  4 01:42:36 np0005470441 systemd[1]: libpod-357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1.scope: Deactivated successfully.
Oct  4 01:42:36 np0005470441 conmon[227537]: conmon 357608e65fe5c5f2ce58 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1.scope/container/memory.events
Oct  4 01:42:36 np0005470441 podman[228064]: 2025-10-04 05:42:36.345631416 +0000 UTC m=+0.145489746 container died 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:42:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1-userdata-shm.mount: Deactivated successfully.
Oct  4 01:42:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay-475e75a1610e0398cddae0a1fde46617db6c36c5ce83f95b3767f87bba5497e5-merged.mount: Deactivated successfully.
Oct  4 01:42:36 np0005470441 podman[228064]: 2025-10-04 05:42:36.597795854 +0000 UTC m=+0.397654144 container cleanup 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:42:36 np0005470441 systemd[1]: libpod-conmon-357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1.scope: Deactivated successfully.
Oct  4 01:42:36 np0005470441 podman[228095]: 2025-10-04 05:42:36.873297215 +0000 UTC m=+0.244456680 container remove 357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.880 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f574c518-5d0f-4707-81a6-669f67c7226d]: (4, ('Sat Oct  4 05:42:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382 (357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1)\n357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1\nSat Oct  4 05:42:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382 (357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1)\n357608e65fe5c5f2ce5819eff482a9c536450df99b0dd2790488ee78693ba3a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.882 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bc0ccd92-dbc7-4e1c-bc7b-3caa5d4e7625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.883 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2bb6534-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:36 np0005470441 kernel: tapd2bb6534-e0: left promiscuous mode
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.892 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e86a32f0-cdb5-41ae-8430-cbf4f54cdbdd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.940 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[93b5be19-7fc6-4ce6-9ab7-98d2337d5f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.941 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c5f6f394-ba9f-4f7b-80f0-914738ca9821]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.949 2 DEBUG nova.compute.manager [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-unplugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.950 2 DEBUG oslo_concurrency.lockutils [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.950 2 DEBUG oslo_concurrency.lockutils [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.950 2 DEBUG oslo_concurrency.lockutils [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.950 2 DEBUG nova.compute.manager [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-unplugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:36 np0005470441 nova_compute[192626]: 2025-10-04 05:42:36.950 2 WARNING nova.compute.manager [req-2c49e342-f132-4b3e-b42b-a3bffd5ac75f req-956c8413-7514-4a94-876c-b77fea556718 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-unplugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd for instance with vm_state active and task_state None.#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.965 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[157bf382-83c8-4a2a-aa89-c30d7235d100]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 438365, 'reachable_time': 34890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228110, 'error': None, 'target': 'ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.968 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2bb6534-edcf-4f06-bd53-7102e28ef382 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:42:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:36.968 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[9ae83c1d-6347-4a0b-963a-1bd97eff1e0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:36 np0005470441 systemd[1]: run-netns-ovnmeta\x2dd2bb6534\x2dedcf\x2d4f06\x2dbd53\x2d7102e28ef382.mount: Deactivated successfully.
Oct  4 01:42:37 np0005470441 nova_compute[192626]: 2025-10-04 05:42:37.507 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:42:37 np0005470441 nova_compute[192626]: 2025-10-04 05:42:37.507 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:42:37 np0005470441 nova_compute[192626]: 2025-10-04 05:42:37.508 2 DEBUG nova.network.neutron [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:42:37 np0005470441 nova_compute[192626]: 2025-10-04 05:42:37.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:37 np0005470441 nova_compute[192626]: 2025-10-04 05:42:37.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.060 2 DEBUG nova.compute.manager [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.060 2 DEBUG oslo_concurrency.lockutils [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.060 2 DEBUG oslo_concurrency.lockutils [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.061 2 DEBUG oslo_concurrency.lockutils [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.061 2 DEBUG nova.compute.manager [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.061 2 WARNING nova.compute.manager [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-plugged-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd for instance with vm_state active and task_state None.#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.061 2 DEBUG nova.compute.manager [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-deleted-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.062 2 INFO nova.compute.manager [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Neutron deleted interface 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd; detaching it from the instance and deleting it from the info cache#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.062 2 DEBUG nova.network.neutron [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.095 2 DEBUG nova.objects.instance [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lazy-loading 'system_metadata' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.131 2 DEBUG nova.objects.instance [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lazy-loading 'flavor' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.159 2 DEBUG nova.virt.libvirt.vif [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.160 2 DEBUG nova.network.os_vif_util [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.160 2 DEBUG nova.network.os_vif_util [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.164 2 DEBUG nova.virt.libvirt.guest [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.167 2 DEBUG nova.virt.libvirt.guest [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface>not found in domain: <domain type='kvm' id='17'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <name>instance-0000001f</name>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <uuid>dfa10e04-6283-4c0a-94b0-6b4841e55401</uuid>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:42:36</nova:creationTime>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <memory unit='KiB'>131072</memory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <vcpu placement='static'>1</vcpu>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <resource>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <partition>/machine</partition>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </resource>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <sysinfo type='smbios'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='manufacturer'>RDO</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='product'>OpenStack Compute</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='serial'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='uuid'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='family'>Virtual Machine</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <boot dev='hd'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <smbios mode='sysinfo'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <vmcoreinfo state='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <cpu mode='custom' match='exact' check='full'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <model fallback='forbid'>Nehalem</model>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='x2apic'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='hypervisor'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='vme'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <clock offset='utc'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='pit' tickpolicy='delay'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='hpet' present='no'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_poweroff>destroy</on_poweroff>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_reboot>restart</on_reboot>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_crash>destroy</on_crash>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <disk type='file' device='disk'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk' index='2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backingStore type='file' index='3'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <format type='raw'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <source file='/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <backingStore/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      </backingStore>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='vda' bus='virtio'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='virtio-disk0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <disk type='file' device='cdrom'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='qemu' type='raw' cache='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config' index='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backingStore/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='sda' bus='sata'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <readonly/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='sata0-0-0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='0' model='pcie-root'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pcie.0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='1' port='0x10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='2' port='0x11'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='3' port='0x12'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='4' port='0x13'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='5' port='0x14'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='6' port='0x15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='7' port='0x16'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='8' port='0x17'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.8'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='9' port='0x18'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.9'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='10' port='0x19'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='11' port='0x1a'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.11'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='12' port='0x1b'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.12'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='13' port='0x1c'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.13'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='14' port='0x1d'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.14'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='15' port='0x1e'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='16' port='0x1f'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.16'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='17' port='0x20'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.17'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='18' port='0x21'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.18'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='19' port='0x22'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.19'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='20' port='0x23'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.20'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='21' port='0x24'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.21'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='22' port='0x25'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.22'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='23' port='0x26'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.23'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='24' port='0x27'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.24'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='25' port='0x28'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.25'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-pci-bridge'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.26'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='usb'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='sata' index='0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='ide'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <interface type='ethernet'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <mac address='fa:16:3e:37:07:15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='tapc638bfcb-e1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model type='virtio'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='vhost' rx_queue_size='512'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <mtu size='1442'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='net0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <serial type='pty'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target type='isa-serial' port='0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <model name='isa-serial'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      </target>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <console type='pty' tty='/dev/pts/0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target type='serial' port='0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </console>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='tablet' bus='usb'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='usb' bus='0' port='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='mouse' bus='ps2'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='keyboard' bus='ps2'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <listen type='address' address='::0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <audio id='1' type='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model type='virtio' heads='1' primary='yes'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='video0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <watchdog model='itco' action='reset'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='watchdog0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </watchdog>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <memballoon model='virtio'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <stats period='10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='balloon0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <rng model='virtio'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backend model='random'>/dev/urandom</backend>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='rng0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <label>system_u:system_r:svirt_t:s0:c684,c998</label>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c684,c998</imagelabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <label>+107:+107</label>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <imagelabel>+107:+107</imagelabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.167 2 DEBUG nova.virt.libvirt.guest [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.170 2 DEBUG nova.virt.libvirt.guest [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:0d:03:22"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap97c6d65d-ad"/></interface>not found in domain: <domain type='kvm' id='17'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <name>instance-0000001f</name>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <uuid>dfa10e04-6283-4c0a-94b0-6b4841e55401</uuid>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:42:36</nova:creationTime>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <memory unit='KiB'>131072</memory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <vcpu placement='static'>1</vcpu>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <resource>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <partition>/machine</partition>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </resource>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <sysinfo type='smbios'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='manufacturer'>RDO</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='product'>OpenStack Compute</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='serial'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='uuid'>dfa10e04-6283-4c0a-94b0-6b4841e55401</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <entry name='family'>Virtual Machine</entry>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <boot dev='hd'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <smbios mode='sysinfo'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <vmcoreinfo state='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <cpu mode='custom' match='exact' check='full'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <model fallback='forbid'>Nehalem</model>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='x2apic'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='hypervisor'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <feature policy='require' name='vme'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <clock offset='utc'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='pit' tickpolicy='delay'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <timer name='hpet' present='no'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_poweroff>destroy</on_poweroff>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_reboot>restart</on_reboot>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <on_crash>destroy</on_crash>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <disk type='file' device='disk'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk' index='2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backingStore type='file' index='3'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <format type='raw'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <source file='/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <backingStore/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      </backingStore>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='vda' bus='virtio'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='virtio-disk0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <disk type='file' device='cdrom'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='qemu' type='raw' cache='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/disk.config' index='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backingStore/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='sda' bus='sata'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <readonly/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='sata0-0-0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='0' model='pcie-root'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pcie.0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='1' port='0x10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='2' port='0x11'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='3' port='0x12'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='4' port='0x13'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='5' port='0x14'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='6' port='0x15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='7' port='0x16'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='8' port='0x17'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.8'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='9' port='0x18'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.9'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='10' port='0x19'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='11' port='0x1a'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.11'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='12' port='0x1b'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.12'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='13' port='0x1c'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.13'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='14' port='0x1d'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.14'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='15' port='0x1e'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='16' port='0x1f'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.16'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='17' port='0x20'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.17'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='18' port='0x21'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.18'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='19' port='0x22'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.19'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='20' port='0x23'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.20'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='21' port='0x24'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.21'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='22' port='0x25'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.22'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='23' port='0x26'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.23'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='24' port='0x27'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.24'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-root-port'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target chassis='25' port='0x28'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.25'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model name='pcie-pci-bridge'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='pci.26'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='usb'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <controller type='sata' index='0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='ide'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </controller>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <interface type='ethernet'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <mac address='fa:16:3e:37:07:15'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target dev='tapc638bfcb-e1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model type='virtio'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <driver name='vhost' rx_queue_size='512'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <mtu size='1442'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='net0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <serial type='pty'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target type='isa-serial' port='0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:        <model name='isa-serial'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      </target>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <console type='pty' tty='/dev/pts/0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <source path='/dev/pts/0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <log file='/var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401/console.log' append='off'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <target type='serial' port='0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='serial0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </console>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='tablet' bus='usb'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='usb' bus='0' port='1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='mouse' bus='ps2'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input1'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <input type='keyboard' bus='ps2'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='input2'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </input>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <listen type='address' address='::0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </graphics>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <audio id='1' type='none'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <model type='virtio' heads='1' primary='yes'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='video0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <watchdog model='itco' action='reset'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='watchdog0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </watchdog>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <memballoon model='virtio'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <stats period='10'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='balloon0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <rng model='virtio'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <backend model='random'>/dev/urandom</backend>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <alias name='rng0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <label>system_u:system_r:svirt_t:s0:c684,c998</label>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c684,c998</imagelabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <label>+107:+107</label>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <imagelabel>+107:+107</imagelabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </seclabel>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.171 2 WARNING nova.virt.libvirt.driver [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Detaching interface fa:16:3e:0d:03:22 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap97c6d65d-ad' not found.
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.171 2 DEBUG nova.virt.libvirt.vif [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.171 2 DEBUG nova.network.os_vif_util [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Converting VIF {"id": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "address": "fa:16:3e:0d:03:22", "network": {"id": "d2bb6534-edcf-4f06-bd53-7102e28ef382", "bridge": "br-int", "label": "tempest-network-smoke--1150424479", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap97c6d65d-ad", "ovs_interfaceid": "97c6d65d-ad9d-4e43-a41c-301d4df8ccdd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.172 2 DEBUG nova.network.os_vif_util [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.172 2 DEBUG os_vif [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap97c6d65d-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.174 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.176 2 INFO os_vif [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0d:03:22,bridge_name='br-int',has_traffic_filtering=True,id=97c6d65d-ad9d-4e43-a41c-301d4df8ccdd,network=Network(d2bb6534-edcf-4f06-bd53-7102e28ef382),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap97c6d65d-ad')
Oct  4 01:42:39 np0005470441 nova_compute[192626]: 2025-10-04 05:42:39.177 2 DEBUG nova.virt.libvirt.guest [req-1c1e1ebe-1388-4600-b360-d172dfa1cdbd req-30194e9b-359d-49c9-a1dc-a4215da5b0e5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:name>tempest-TestNetworkBasicOps-server-37334901</nova:name>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:creationTime>2025-10-04 05:42:39</nova:creationTime>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:flavor name="m1.nano">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:memory>128</nova:memory>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:disk>1</nova:disk>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:swap>0</nova:swap>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:vcpus>1</nova:vcpus>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:flavor>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:owner>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  <nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    <nova:port uuid="c638bfcb-e144-4a6d-9626-9ae28c1a6437">
Oct  4 01:42:39 np0005470441 nova_compute[192626]:      <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:    </nova:port>
Oct  4 01:42:39 np0005470441 nova_compute[192626]:  </nova:ports>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: </nova:instance>
Oct  4 01:42:39 np0005470441 nova_compute[192626]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  4 01:42:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:40Z|00254|binding|INFO|Releasing lport 224f2340-bf1e-48df-8648-a854ae221536 from this chassis (sb_readonly=0)
Oct  4 01:42:40 np0005470441 nova_compute[192626]: 2025-10-04 05:42:40.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:40 np0005470441 podman[228111]: 2025-10-04 05:42:40.342650885 +0000 UTC m=+0.067537430 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:42:40 np0005470441 nova_compute[192626]: 2025-10-04 05:42:40.909 2 INFO nova.network.neutron [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Port 97c6d65d-ad9d-4e43-a41c-301d4df8ccdd from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  4 01:42:40 np0005470441 nova_compute[192626]: 2025-10-04 05:42:40.910 2 DEBUG nova.network.neutron [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:40 np0005470441 nova_compute[192626]: 2025-10-04 05:42:40.935 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:42:40 np0005470441 nova_compute[192626]: 2025-10-04 05:42:40.962 2 DEBUG oslo_concurrency.lockutils [None req-7eefe0f3-3d0d-4a2e-8ce4-83bc4f08c131 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "interface-dfa10e04-6283-4c0a-94b0-6b4841e55401-97c6d65d-ad9d-4e43-a41c-301d4df8ccdd" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.912 2 DEBUG nova.compute.manager [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.913 2 DEBUG nova.compute.manager [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing instance network info cache due to event network-changed-c638bfcb-e144-4a6d-9626-9ae28c1a6437. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.917 2 DEBUG oslo_concurrency.lockutils [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.917 2 DEBUG oslo_concurrency.lockutils [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:42:41 np0005470441 nova_compute[192626]: 2025-10-04 05:42:41.917 2 DEBUG nova.network.neutron [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Refreshing network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.048 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.049 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.049 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.049 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.050 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.052 2 INFO nova.compute.manager [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Terminating instance#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.053 2 DEBUG nova.compute.manager [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:42:42 np0005470441 kernel: tapc638bfcb-e1 (unregistering): left promiscuous mode
Oct  4 01:42:42 np0005470441 NetworkManager[51690]: <info>  [1759556562.0813] device (tapc638bfcb-e1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:42:42 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:42Z|00255|binding|INFO|Releasing lport c638bfcb-e144-4a6d-9626-9ae28c1a6437 from this chassis (sb_readonly=0)
Oct  4 01:42:42 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:42Z|00256|binding|INFO|Setting lport c638bfcb-e144-4a6d-9626-9ae28c1a6437 down in Southbound
Oct  4 01:42:42 np0005470441 ovn_controller[94840]: 2025-10-04T05:42:42Z|00257|binding|INFO|Removing iface tapc638bfcb-e1 ovn-installed in OVS
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.147 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:07:15 10.100.0.8'], port_security=['fa:16:3e:37:07:15 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'dfa10e04-6283-4c0a-94b0-6b4841e55401', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'aa9609d0-80fa-4d9b-8c8b-3ac9c8b42178', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aa7a6f0-a5c6-4212-9ae8-0a0cc5360bf3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c638bfcb-e144-4a6d-9626-9ae28c1a6437) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.148 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c638bfcb-e144-4a6d-9626-9ae28c1a6437 in datapath a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 unbound from our chassis#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.150 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.151 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0e0cd9-8787-4311-9b39-3974f9a3ebe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.152 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 namespace which is not needed anymore#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Oct  4 01:42:42 np0005470441 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000001f.scope: Consumed 17.765s CPU time.
Oct  4 01:42:42 np0005470441 systemd-machined[152624]: Machine qemu-17-instance-0000001f terminated.
Oct  4 01:42:42 np0005470441 podman[228137]: 2025-10-04 05:42:42.233694196 +0000 UTC m=+0.077312589 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [NOTICE]   (227207) : haproxy version is 2.8.14-c23fe91
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [NOTICE]   (227207) : path to executable is /usr/sbin/haproxy
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [WARNING]  (227207) : Exiting Master process...
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [WARNING]  (227207) : Exiting Master process...
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [ALERT]    (227207) : Current worker (227209) exited with code 143 (Terminated)
Oct  4 01:42:42 np0005470441 neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9[227203]: [WARNING]  (227207) : All workers exited. Exiting... (0)
Oct  4 01:42:42 np0005470441 systemd[1]: libpod-73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550.scope: Deactivated successfully.
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.332 2 INFO nova.virt.libvirt.driver [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance destroyed successfully.#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.332 2 DEBUG nova.objects.instance [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid dfa10e04-6283-4c0a-94b0-6b4841e55401 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:42:42 np0005470441 podman[228181]: 2025-10-04 05:42:42.333836452 +0000 UTC m=+0.063728782 container died 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.353 2 DEBUG nova.virt.libvirt.vif [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-37334901',display_name='tempest-TestNetworkBasicOps-server-37334901',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-37334901',id=31,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLtYzD4e1KCN/z4uLxLj7izbsxqZJxh6Pv31gcfsfjPnCvVlMMrgonudJHqjt6R8+FcFXZzFbf7DQnEhzC0ZzChpdbvO1/pkiXuY2oVpqEvJDzU9xn2ZyA+8qwyHwLh75Q==',key_name='tempest-TestNetworkBasicOps-1147181905',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:40:54Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-oe314469',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:40:54Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=dfa10e04-6283-4c0a-94b0-6b4841e55401,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.355 2 DEBUG nova.network.os_vif_util [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.228", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.357 2 DEBUG nova.network.os_vif_util [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.357 2 DEBUG os_vif [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.365 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc638bfcb-e1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:42 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550-userdata-shm.mount: Deactivated successfully.
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:42:42 np0005470441 systemd[1]: var-lib-containers-storage-overlay-6f63165a507e8a7e46f165a9d636c347c68924094d5be7a88cbf471d6239fcab-merged.mount: Deactivated successfully.
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.383 2 INFO os_vif [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:37:07:15,bridge_name='br-int',has_traffic_filtering=True,id=c638bfcb-e144-4a6d-9626-9ae28c1a6437,network=Network(a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc638bfcb-e1')#033[00m
Oct  4 01:42:42 np0005470441 podman[228181]: 2025-10-04 05:42:42.383813013 +0000 UTC m=+0.113705363 container cleanup 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.383 2 INFO nova.virt.libvirt.driver [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Deleting instance files /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401_del#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.384 2 INFO nova.virt.libvirt.driver [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Deletion of /var/lib/nova/instances/dfa10e04-6283-4c0a-94b0-6b4841e55401_del complete#033[00m
Oct  4 01:42:42 np0005470441 systemd[1]: libpod-conmon-73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550.scope: Deactivated successfully.
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.468 2 INFO nova.compute.manager [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.468 2 DEBUG oslo.service.loopingcall [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.469 2 DEBUG nova.compute.manager [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.469 2 DEBUG nova.network.neutron [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:42:42 np0005470441 podman[228225]: 2025-10-04 05:42:42.47097296 +0000 UTC m=+0.053396469 container remove 73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.479 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbacf69-a007-44ea-993d-05bf98637b51]: (4, ('Sat Oct  4 05:42:42 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 (73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550)\n73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550\nSat Oct  4 05:42:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 (73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550)\n73c68a0efa52223cd2d2ccdafc4d5d0942684dceb5180dfb4ae9b80db49f3550\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.480 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[18d26d66-7f13-4877-a092-cad1c585e669]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.481 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1a4f623-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 kernel: tapa1a4f623-10: left promiscuous mode
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.502 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9c17cc89-415e-4346-8451-4d4a26748d12]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.532 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3fa80cde-3770-4a48-9f30-d3ce6c77f1ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.534 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[48e9722e-fc79-4f20-8194-1cd72ecb1a6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.549 2 DEBUG nova.compute.manager [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-unplugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.550 2 DEBUG oslo_concurrency.lockutils [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.550 2 DEBUG oslo_concurrency.lockutils [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.550 2 DEBUG oslo_concurrency.lockutils [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.551 2 DEBUG nova.compute.manager [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-unplugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.551 2 DEBUG nova.compute.manager [req-0ffbfa7a-b250-4812-8a33-471c9668ad2d req-bdaaae5b-6fa2-406a-83d8-544a155cdd2e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-unplugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.554 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[67bc09c3-12e6-48d6-a161-3415ec5a4458]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434962, 'reachable_time': 44573, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228240, 'error': None, 'target': 'ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.556 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:42:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:42:42.556 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b27fe4-2e2f-41e2-9a7b-eda6ceb08ff0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:42:42 np0005470441 systemd[1]: run-netns-ovnmeta\x2da1a4f623\x2d1865\x2d45eb\x2d9f92\x2d6e6b0b4fe8b9.mount: Deactivated successfully.
Oct  4 01:42:42 np0005470441 nova_compute[192626]: 2025-10-04 05:42:42.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:43 np0005470441 nova_compute[192626]: 2025-10-04 05:42:43.827 2 DEBUG nova.network.neutron [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:43 np0005470441 nova_compute[192626]: 2025-10-04 05:42:43.845 2 INFO nova.compute.manager [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Took 1.38 seconds to deallocate network for instance.#033[00m
Oct  4 01:42:43 np0005470441 nova_compute[192626]: 2025-10-04 05:42:43.927 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:43 np0005470441 nova_compute[192626]: 2025-10-04 05:42:43.927 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.018 2 DEBUG nova.compute.provider_tree [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.045 2 DEBUG nova.scheduler.client.report [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.074 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.106 2 DEBUG nova.network.neutron [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updated VIF entry in instance network info cache for port c638bfcb-e144-4a6d-9626-9ae28c1a6437. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.107 2 DEBUG nova.network.neutron [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Updating instance_info_cache with network_info: [{"id": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "address": "fa:16:3e:37:07:15", "network": {"id": "a1a4f623-1865-45eb-9f92-6e6b0b4fe8b9", "bridge": "br-int", "label": "tempest-network-smoke--1126375916", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc638bfcb-e1", "ovs_interfaceid": "c638bfcb-e144-4a6d-9626-9ae28c1a6437", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.111 2 INFO nova.scheduler.client.report [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance dfa10e04-6283-4c0a-94b0-6b4841e55401#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.141 2 DEBUG oslo_concurrency.lockutils [req-13ef9f8a-3101-4318-aca5-031a269fcc01 req-588d8a5e-942f-4914-96c1-5a345bf437dc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-dfa10e04-6283-4c0a-94b0-6b4841e55401" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.243 2 DEBUG oslo_concurrency.lockutils [None req-b644d74f-2468-460b-b735-9a76bb50cf80 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.195s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.922 2 DEBUG nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.922 2 DEBUG oslo_concurrency.lockutils [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.922 2 DEBUG oslo_concurrency.lockutils [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.923 2 DEBUG oslo_concurrency.lockutils [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "dfa10e04-6283-4c0a-94b0-6b4841e55401-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.923 2 DEBUG nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] No waiting events found dispatching network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.923 2 WARNING nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received unexpected event network-vif-plugged-c638bfcb-e144-4a6d-9626-9ae28c1a6437 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.923 2 DEBUG nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Received event network-vif-deleted-c638bfcb-e144-4a6d-9626-9ae28c1a6437 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.923 2 INFO nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Neutron deleted interface c638bfcb-e144-4a6d-9626-9ae28c1a6437; detaching it from the instance and deleting it from the info cache#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.924 2 DEBUG nova.network.neutron [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  4 01:42:44 np0005470441 nova_compute[192626]: 2025-10-04 05:42:44.926 2 DEBUG nova.compute.manager [req-e00ae392-3620-40a3-aff0-71185dd40503 req-5a23e458-37b0-4c60-983a-07031bb63675 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Detach interface failed, port_id=c638bfcb-e144-4a6d-9626-9ae28c1a6437, reason: Instance dfa10e04-6283-4c0a-94b0-6b4841e55401 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  4 01:42:45 np0005470441 podman[228241]: 2025-10-04 05:42:45.428521734 +0000 UTC m=+0.178806813 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:42:47 np0005470441 nova_compute[192626]: 2025-10-04 05:42:47.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:47 np0005470441 nova_compute[192626]: 2025-10-04 05:42:47.375 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556552.3743784, 9656898e-1d93-434d-88db-975744a112d3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:42:47 np0005470441 nova_compute[192626]: 2025-10-04 05:42:47.375 2 INFO nova.compute.manager [-] [instance: 9656898e-1d93-434d-88db-975744a112d3] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:42:47 np0005470441 nova_compute[192626]: 2025-10-04 05:42:47.394 2 DEBUG nova.compute.manager [None req-3061ab1f-6065-4142-b0ee-3022e3541daa - - - - - -] [instance: 9656898e-1d93-434d-88db-975744a112d3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:42:47 np0005470441 nova_compute[192626]: 2025-10-04 05:42:47.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:48 np0005470441 nova_compute[192626]: 2025-10-04 05:42:48.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:48 np0005470441 nova_compute[192626]: 2025-10-04 05:42:48.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:52 np0005470441 nova_compute[192626]: 2025-10-04 05:42:52.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:52 np0005470441 nova_compute[192626]: 2025-10-04 05:42:52.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:54 np0005470441 podman[228269]: 2025-10-04 05:42:54.303005979 +0000 UTC m=+0.057842006 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct  4 01:42:54 np0005470441 podman[228270]: 2025-10-04 05:42:54.304689256 +0000 UTC m=+0.053683126 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:42:57 np0005470441 nova_compute[192626]: 2025-10-04 05:42:57.327 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556562.3256092, dfa10e04-6283-4c0a-94b0-6b4841e55401 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:42:57 np0005470441 nova_compute[192626]: 2025-10-04 05:42:57.328 2 INFO nova.compute.manager [-] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:42:57 np0005470441 nova_compute[192626]: 2025-10-04 05:42:57.378 2 DEBUG nova.compute.manager [None req-c1bc92f2-b4ce-4da0-8747-5158d2f2096b - - - - - -] [instance: dfa10e04-6283-4c0a-94b0-6b4841e55401] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:42:57 np0005470441 nova_compute[192626]: 2025-10-04 05:42:57.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:57 np0005470441 nova_compute[192626]: 2025-10-04 05:42:57.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:42:58 np0005470441 podman[228313]: 2025-10-04 05:42:58.300630616 +0000 UTC m=+0.055128198 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:42:58 np0005470441 podman[228314]: 2025-10-04 05:42:58.320319706 +0000 UTC m=+0.069495757 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:43:00 np0005470441 nova_compute[192626]: 2025-10-04 05:43:00.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:02 np0005470441 nova_compute[192626]: 2025-10-04 05:43:02.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:02 np0005470441 nova_compute[192626]: 2025-10-04 05:43:02.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:02 np0005470441 nova_compute[192626]: 2025-10-04 05:43:02.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:03 np0005470441 nova_compute[192626]: 2025-10-04 05:43:03.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:03 np0005470441 nova_compute[192626]: 2025-10-04 05:43:03.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:43:06 np0005470441 podman[228352]: 2025-10-04 05:43:06.298592824 +0000 UTC m=+0.057242108 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.741 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.741 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:06.750 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:06.751 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:06.751 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.768 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.768 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.769 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.769 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.933 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.934 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5776MB free_disk=73.42936706542969GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.934 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:06 np0005470441 nova_compute[192626]: 2025-10-04 05:43:06.935 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.000 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.001 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.024 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.042 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.064 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.064 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:07 np0005470441 nova_compute[192626]: 2025-10-04 05:43:07.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:09 np0005470441 nova_compute[192626]: 2025-10-04 05:43:09.039 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:09 np0005470441 nova_compute[192626]: 2025-10-04 05:43:09.040 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:10 np0005470441 nova_compute[192626]: 2025-10-04 05:43:10.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:11 np0005470441 podman[228375]: 2025-10-04 05:43:11.297272734 +0000 UTC m=+0.050736683 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.507 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.507 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.527 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.609 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.609 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.617 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.618 2 INFO nova.compute.claims [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.762 2 DEBUG nova.compute.provider_tree [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.783 2 DEBUG nova.scheduler.client.report [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.817 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.818 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.890 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.891 2 DEBUG nova.network.neutron [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.918 2 INFO nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:43:11 np0005470441 nova_compute[192626]: 2025-10-04 05:43:11.951 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.052 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.054 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.055 2 INFO nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Creating image(s)#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.056 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.056 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.058 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.085 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.144 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.145 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.146 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.162 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.220 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.221 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.256 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.257 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.257 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.309 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.311 2 DEBUG nova.virt.disk.api [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.312 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.384 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.385 2 DEBUG nova.virt.disk.api [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.386 2 DEBUG nova.objects.instance [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 55e90e51-a248-4cbd-b153-f507c41f4982 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.402 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.403 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Ensure instance console log exists: /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.403 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.403 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.404 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:12 np0005470441 nova_compute[192626]: 2025-10-04 05:43:12.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:13 np0005470441 podman[228414]: 2025-10-04 05:43:13.301380658 +0000 UTC m=+0.056884837 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  4 01:43:13 np0005470441 nova_compute[192626]: 2025-10-04 05:43:13.843 2 DEBUG nova.policy [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:43:15 np0005470441 nova_compute[192626]: 2025-10-04 05:43:15.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:43:16 np0005470441 podman[228433]: 2025-10-04 05:43:16.353766167 +0000 UTC m=+0.103817742 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:43:16 np0005470441 nova_compute[192626]: 2025-10-04 05:43:16.919 2 DEBUG nova.network.neutron [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Successfully updated port: a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:43:16 np0005470441 nova_compute[192626]: 2025-10-04 05:43:16.978 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:43:16 np0005470441 nova_compute[192626]: 2025-10-04 05:43:16.978 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:43:16 np0005470441 nova_compute[192626]: 2025-10-04 05:43:16.979 2 DEBUG nova.network.neutron [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.043 2 DEBUG nova.compute.manager [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.044 2 DEBUG nova.compute.manager [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Refreshing instance network info cache due to event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.044 2 DEBUG oslo_concurrency.lockutils [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:17 np0005470441 nova_compute[192626]: 2025-10-04 05:43:17.803 2 DEBUG nova.network.neutron [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.002 2 DEBUG nova.network.neutron [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updating instance_info_cache with network_info: [{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.113 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.113 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Instance network_info: |[{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.114 2 DEBUG oslo_concurrency.lockutils [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.114 2 DEBUG nova.network.neutron [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Refreshing network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.119 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Start _get_guest_xml network_info=[{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.122 2 WARNING nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.127 2 DEBUG nova.virt.libvirt.host [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.127 2 DEBUG nova.virt.libvirt.host [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.130 2 DEBUG nova.virt.libvirt.host [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.130 2 DEBUG nova.virt.libvirt.host [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.131 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.132 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.132 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.132 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.133 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.133 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.133 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.133 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.134 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.134 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.134 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.134 2 DEBUG nova.virt.hardware [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.138 2 DEBUG nova.virt.libvirt.vif [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1526462671',display_name='tempest-TestNetworkBasicOps-server-1526462671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1526462671',id=35,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPt3qWalml7mk52pdQOZFDl2/UT2LXksJHHQB9r19ITU9pHmiVHK4ixsUJIa2RQmj4aEJey/Oe+1LgPsEgu0sKZdSH9G47lOUSajsBnCcx2gF8dubrOZiF6m7GhxuefL1g==',key_name='tempest-TestNetworkBasicOps-1776623195',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dx9042t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:43:11Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=55e90e51-a248-4cbd-b153-f507c41f4982,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.138 2 DEBUG nova.network.os_vif_util [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.139 2 DEBUG nova.network.os_vif_util [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.139 2 DEBUG nova.objects.instance [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 55e90e51-a248-4cbd-b153-f507c41f4982 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.189 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <uuid>55e90e51-a248-4cbd-b153-f507c41f4982</uuid>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <name>instance-00000023</name>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-1526462671</nova:name>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:43:20</nova:creationTime>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        <nova:port uuid="a1a812d2-e8c0-4cd3-82b3-d1877f068ca7">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="serial">55e90e51-a248-4cbd-b153-f507c41f4982</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="uuid">55e90e51-a248-4cbd-b153-f507c41f4982</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.config"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:97:d2:b0"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <target dev="tapa1a812d2-e8"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/console.log" append="off"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:43:20 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:43:20 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:43:20 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:43:20 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.190 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Preparing to wait for external event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.191 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.191 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.192 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.193 2 DEBUG nova.virt.libvirt.vif [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1526462671',display_name='tempest-TestNetworkBasicOps-server-1526462671',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1526462671',id=35,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPt3qWalml7mk52pdQOZFDl2/UT2LXksJHHQB9r19ITU9pHmiVHK4ixsUJIa2RQmj4aEJey/Oe+1LgPsEgu0sKZdSH9G47lOUSajsBnCcx2gF8dubrOZiF6m7GhxuefL1g==',key_name='tempest-TestNetworkBasicOps-1776623195',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dx9042t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:43:11Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=55e90e51-a248-4cbd-b153-f507c41f4982,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.193 2 DEBUG nova.network.os_vif_util [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.194 2 DEBUG nova.network.os_vif_util [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.194 2 DEBUG os_vif [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.196 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.196 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.200 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1a812d2-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.201 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa1a812d2-e8, col_values=(('external_ids', {'iface-id': 'a1a812d2-e8c0-4cd3-82b3-d1877f068ca7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:d2:b0', 'vm-uuid': '55e90e51-a248-4cbd-b153-f507c41f4982'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:20 np0005470441 NetworkManager[51690]: <info>  [1759556600.2038] manager: (tapa1a812d2-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.211 2 INFO os_vif [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8')#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.348 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.349 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.349 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:97:d2:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:43:20 np0005470441 nova_compute[192626]: 2025-10-04 05:43:20.350 2 INFO nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Using config drive#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.005 2 INFO nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Creating config drive at /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.config#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.010 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4enmod8i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.140 2 DEBUG oslo_concurrency.processutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4enmod8i" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:21 np0005470441 kernel: tapa1a812d2-e8: entered promiscuous mode
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.2440] manager: (tapa1a812d2-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/114)
Oct  4 01:43:21 np0005470441 systemd-udevd[228476]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:43:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:21Z|00258|binding|INFO|Claiming lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for this chassis.
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:21Z|00259|binding|INFO|a1a812d2-e8c0-4cd3-82b3-d1877f068ca7: Claiming fa:16:3e:97:d2:b0 10.100.0.6
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.3265] device (tapa1a812d2-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.3279] device (tapa1a812d2-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:43:21 np0005470441 systemd-machined[152624]: New machine qemu-19-instance-00000023.
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.366 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d2:b0 10.100.0.6'], port_security=['fa:16:3e:97:d2:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '55e90e51-a248-4cbd-b153-f507c41f4982', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d97328c8-d09e-4ace-8909-f59582083038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'af6e9f85-2a38-4738-916c-2544b03b6a65', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f07a0568-5429-4fa1-87e0-fb13bf3492a0, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:43:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:21Z|00260|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 ovn-installed in OVS
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.368 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 in datapath d97328c8-d09e-4ace-8909-f59582083038 bound to our chassis#033[00m
Oct  4 01:43:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:21Z|00261|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 up in Southbound
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.369 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d97328c8-d09e-4ace-8909-f59582083038#033[00m
Oct  4 01:43:21 np0005470441 systemd[1]: Started Virtual Machine qemu-19-instance-00000023.
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.382 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1f0170-1cd8-481c-b4c2-a49ea9614949]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.383 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd97328c8-d1 in ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.384 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd97328c8-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.384 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[37cd63d6-c36f-445c-88e4-f38f48dad63a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.386 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[99980ad5-f9cb-4b33-a030-2db40da15fd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.398 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b21d84-9b7c-4f28-8152-9fab640ed6b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.428 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cae460b2-698d-45d3-bcd0-1bf2392b975b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.462 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ca94b04d-5a7c-474e-9ee7-170184d9d326]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.4699] manager: (tapd97328c8-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/115)
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.470 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[339177ab-3a27-4cc2-91e4-aab99a415ee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.506 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5d3d33-4b54-424e-8098-42058efc83ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.511 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[af749e38-484e-4057-a806-fdcfe3c60fe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.5403] device (tapd97328c8-d0): carrier: link connected
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.544 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[0971f5fa-3d0e-4c4c-9113-37d94f616aa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.562 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf3da4a-c730-4bf6-9199-b4e3e6a80896]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd97328c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:1c:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449919, 'reachable_time': 28396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228512, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.580 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[db383b31-2d60-4ec0-bb82-62f1d7e9f0c5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:1cb7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 449919, 'tstamp': 449919}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228513, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.597 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[212bea49-1936-4c75-8bd7-31876b37fc7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd97328c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:1c:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449919, 'reachable_time': 28396, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228514, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.626 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1fba3c78-ba51-47ea-adc7-74bb2481f740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.684 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c51ebbdb-09f4-4ae2-9d4d-7cfd7604e42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.685 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd97328c8-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.686 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.687 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd97328c8-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:21 np0005470441 kernel: tapd97328c8-d0: entered promiscuous mode
Oct  4 01:43:21 np0005470441 NetworkManager[51690]: <info>  [1759556601.6903] manager: (tapd97328c8-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.695 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd97328c8-d0, col_values=(('external_ids', {'iface-id': 'e7fb021d-dca1-41e7-bb9e-c8037fd5783e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:21 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:21Z|00262|binding|INFO|Releasing lport e7fb021d-dca1-41e7-bb9e-c8037fd5783e from this chassis (sb_readonly=0)
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.699 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.700 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[032c4200-4bc9-4b16-a038-b10d413f3348]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.701 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-d97328c8-d09e-4ace-8909-f59582083038
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID d97328c8-d09e-4ace-8909-f59582083038
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:43:21 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:21.701 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'env', 'PROCESS_TAG=haproxy-d97328c8-d09e-4ace-8909-f59582083038', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d97328c8-d09e-4ace-8909-f59582083038.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:43:21 np0005470441 nova_compute[192626]: 2025-10-04 05:43:21.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:22 np0005470441 podman[228553]: 2025-10-04 05:43:22.130014579 +0000 UTC m=+0.091215974 container create ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  4 01:43:22 np0005470441 podman[228553]: 2025-10-04 05:43:22.061639495 +0000 UTC m=+0.022840910 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:43:22 np0005470441 systemd[1]: Started libpod-conmon-ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c.scope.
Oct  4 01:43:22 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.200 2 DEBUG nova.network.neutron [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updated VIF entry in instance network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:43:22 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/101130d84ba969b09543e7ca33a1efb1e93f07acc10efbec361220170c77570b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.201 2 DEBUG nova.network.neutron [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updating instance_info_cache with network_info: [{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.209 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556602.2089186, 55e90e51-a248-4cbd-b153-f507c41f4982 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.209 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] VM Started (Lifecycle Event)#033[00m
Oct  4 01:43:22 np0005470441 podman[228553]: 2025-10-04 05:43:22.222165658 +0000 UTC m=+0.183367153 container init ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:43:22 np0005470441 podman[228553]: 2025-10-04 05:43:22.228662652 +0000 UTC m=+0.189864077 container start ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:43:22 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [NOTICE]   (228572) : New worker (228574) forked
Oct  4 01:43:22 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [NOTICE]   (228572) : Loading success.
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.254 2 DEBUG oslo_concurrency.lockutils [req-10e84813-ceeb-4db9-ab3e-f631476b5ae6 req-c2eaffff-64b9-4b9f-8e11-26247e6b12e2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.295 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.300 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556602.2090602, 55e90e51-a248-4cbd-b153-f507c41f4982 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.301 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.323 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.327 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.371 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:43:22 np0005470441 nova_compute[192626]: 2025-10-04 05:43:22.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.033 2 DEBUG nova.compute.manager [req-3be104ca-446e-43fb-96c8-21a9b1beb37e req-346c0d5e-e9c9-41c5-809d-55babdbfd9e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.034 2 DEBUG oslo_concurrency.lockutils [req-3be104ca-446e-43fb-96c8-21a9b1beb37e req-346c0d5e-e9c9-41c5-809d-55babdbfd9e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.034 2 DEBUG oslo_concurrency.lockutils [req-3be104ca-446e-43fb-96c8-21a9b1beb37e req-346c0d5e-e9c9-41c5-809d-55babdbfd9e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.034 2 DEBUG oslo_concurrency.lockutils [req-3be104ca-446e-43fb-96c8-21a9b1beb37e req-346c0d5e-e9c9-41c5-809d-55babdbfd9e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.035 2 DEBUG nova.compute.manager [req-3be104ca-446e-43fb-96c8-21a9b1beb37e req-346c0d5e-e9c9-41c5-809d-55babdbfd9e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Processing event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.035 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.038 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556603.037855, 55e90e51-a248-4cbd-b153-f507c41f4982 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.038 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.040 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.043 2 INFO nova.virt.libvirt.driver [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Instance spawned successfully.#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.043 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.067 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.075 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.076 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.077 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.078 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.079 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.080 2 DEBUG nova.virt.libvirt.driver [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.086 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.130 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.173 2 INFO nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Took 11.12 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.173 2 DEBUG nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.258 2 INFO nova.compute.manager [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Took 11.67 seconds to build instance.#033[00m
Oct  4 01:43:23 np0005470441 nova_compute[192626]: 2025-10-04 05:43:23.281 2 DEBUG oslo_concurrency.lockutils [None req-84f2d6b8-c08d-450a-9d57-5647003c93cc b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.122 2 DEBUG nova.compute.manager [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.122 2 DEBUG oslo_concurrency.lockutils [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.123 2 DEBUG oslo_concurrency.lockutils [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.123 2 DEBUG oslo_concurrency.lockutils [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.123 2 DEBUG nova.compute.manager [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] No waiting events found dispatching network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.123 2 WARNING nova.compute.manager [req-61423183-204d-45cb-830a-1f0f9a5fb544 req-54fb2932-4cd3-458f-a8db-68ea1a52121d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received unexpected event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:43:25 np0005470441 nova_compute[192626]: 2025-10-04 05:43:25.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:25 np0005470441 podman[228583]: 2025-10-04 05:43:25.318807524 +0000 UTC m=+0.056283051 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Oct  4 01:43:25 np0005470441 podman[228584]: 2025-10-04 05:43:25.321562033 +0000 UTC m=+0.057760373 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:43:27 np0005470441 nova_compute[192626]: 2025-10-04 05:43:27.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:29 np0005470441 podman[228625]: 2025-10-04 05:43:29.322671628 +0000 UTC m=+0.063101214 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:43:29 np0005470441 nova_compute[192626]: 2025-10-04 05:43:29.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:29 np0005470441 NetworkManager[51690]: <info>  [1759556609.3550] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Oct  4 01:43:29 np0005470441 NetworkManager[51690]: <info>  [1759556609.3566] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Oct  4 01:43:29 np0005470441 podman[228626]: 2025-10-04 05:43:29.344746505 +0000 UTC m=+0.079726297 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct  4 01:43:29 np0005470441 nova_compute[192626]: 2025-10-04 05:43:29.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:29 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:29Z|00263|binding|INFO|Releasing lport e7fb021d-dca1-41e7-bb9e-c8037fd5783e from this chassis (sb_readonly=0)
Oct  4 01:43:29 np0005470441 nova_compute[192626]: 2025-10-04 05:43:29.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.502 2 DEBUG nova.compute.manager [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.502 2 DEBUG nova.compute.manager [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Refreshing instance network info cache due to event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.502 2 DEBUG oslo_concurrency.lockutils [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.502 2 DEBUG oslo_concurrency.lockutils [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.503 2 DEBUG nova.network.neutron [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Refreshing network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.507 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.509 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.867 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.868 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.868 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.869 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.869 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.870 2 INFO nova.compute.manager [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Terminating instance#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.871 2 DEBUG nova.compute.manager [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:43:30 np0005470441 kernel: tapa1a812d2-e8 (unregistering): left promiscuous mode
Oct  4 01:43:30 np0005470441 NetworkManager[51690]: <info>  [1759556610.8996] device (tapa1a812d2-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:30Z|00264|binding|INFO|Releasing lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 from this chassis (sb_readonly=0)
Oct  4 01:43:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:30Z|00265|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 down in Southbound
Oct  4 01:43:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:30Z|00266|binding|INFO|Removing iface tapa1a812d2-e8 ovn-installed in OVS
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 nova_compute[192626]: 2025-10-04 05:43:30.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.954 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d2:b0 10.100.0.6'], port_security=['fa:16:3e:97:d2:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '55e90e51-a248-4cbd-b153-f507c41f4982', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d97328c8-d09e-4ace-8909-f59582083038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'af6e9f85-2a38-4738-916c-2544b03b6a65', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f07a0568-5429-4fa1-87e0-fb13bf3492a0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.955 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 in datapath d97328c8-d09e-4ace-8909-f59582083038 unbound from our chassis#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.957 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d97328c8-d09e-4ace-8909-f59582083038, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.958 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3dc107-fa0e-41b6-b2c9-c7bc6d134248]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:30.959 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 namespace which is not needed anymore#033[00m
Oct  4 01:43:30 np0005470441 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Deactivated successfully.
Oct  4 01:43:30 np0005470441 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000023.scope: Consumed 8.779s CPU time.
Oct  4 01:43:30 np0005470441 systemd-machined[152624]: Machine qemu-19-instance-00000023 terminated.
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.138 2 INFO nova.virt.libvirt.driver [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Instance destroyed successfully.#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.139 2 DEBUG nova.objects.instance [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 55e90e51-a248-4cbd-b153-f507c41f4982 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [NOTICE]   (228572) : haproxy version is 2.8.14-c23fe91
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [NOTICE]   (228572) : path to executable is /usr/sbin/haproxy
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [WARNING]  (228572) : Exiting Master process...
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [WARNING]  (228572) : Exiting Master process...
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [ALERT]    (228572) : Current worker (228574) exited with code 143 (Terminated)
Oct  4 01:43:31 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228568]: [WARNING]  (228572) : All workers exited. Exiting... (0)
Oct  4 01:43:31 np0005470441 systemd[1]: libpod-ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c.scope: Deactivated successfully.
Oct  4 01:43:31 np0005470441 podman[228690]: 2025-10-04 05:43:31.150835572 +0000 UTC m=+0.088793735 container died ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:43:31 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c-userdata-shm.mount: Deactivated successfully.
Oct  4 01:43:31 np0005470441 systemd[1]: var-lib-containers-storage-overlay-101130d84ba969b09543e7ca33a1efb1e93f07acc10efbec361220170c77570b-merged.mount: Deactivated successfully.
Oct  4 01:43:31 np0005470441 podman[228690]: 2025-10-04 05:43:31.229289482 +0000 UTC m=+0.167247645 container cleanup ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:43:31 np0005470441 systemd[1]: libpod-conmon-ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c.scope: Deactivated successfully.
Oct  4 01:43:31 np0005470441 podman[228739]: 2025-10-04 05:43:31.355666224 +0000 UTC m=+0.105116969 container remove ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.363 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[eccb23aa-d828-473c-b9e3-ce3db8a70cdd]: (4, ('Sat Oct  4 05:43:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 (ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c)\nba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c\nSat Oct  4 05:43:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 (ba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c)\nba471d97ca0e3fada83ded2eb9d58a1be6fd4e57dc4c8ccdc3e14bf26619a32c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.365 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b0942b-597b-458b-943f-12c41d505ff4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.366 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd97328c8-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:31 np0005470441 kernel: tapd97328c8-d0: left promiscuous mode
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.395 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[054aadec-8db6-4ee4-a32a-1aee8821db61]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.399 2 DEBUG nova.virt.libvirt.vif [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:43:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1526462671',display_name='tempest-TestNetworkBasicOps-server-1526462671',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1526462671',id=35,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPt3qWalml7mk52pdQOZFDl2/UT2LXksJHHQB9r19ITU9pHmiVHK4ixsUJIa2RQmj4aEJey/Oe+1LgPsEgu0sKZdSH9G47lOUSajsBnCcx2gF8dubrOZiF6m7GhxuefL1g==',key_name='tempest-TestNetworkBasicOps-1776623195',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:43:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-dx9042t2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:43:23Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=55e90e51-a248-4cbd-b153-f507c41f4982,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.400 2 DEBUG nova.network.os_vif_util [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.403 2 DEBUG nova.network.os_vif_util [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.404 2 DEBUG os_vif [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.409 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1a812d2-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.417 2 INFO os_vif [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8')#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.418 2 INFO nova.virt.libvirt.driver [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Deleting instance files /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982_del#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.419 2 INFO nova.virt.libvirt.driver [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Deletion of /var/lib/nova/instances/55e90e51-a248-4cbd-b153-f507c41f4982_del complete#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.426 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[260b4cc6-b7d4-46cf-866d-d400f0f23df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.428 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cccdefea-9f60-40f0-862b-498bd052c4f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.452 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf8119a-9c22-49eb-8f29-af36b93def7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 449911, 'reachable_time': 43071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228757, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.457 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:43:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:31.457 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[a0644de6-afe9-4794-ba02-4f38647d9d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:31 np0005470441 systemd[1]: run-netns-ovnmeta\x2dd97328c8\x2dd09e\x2d4ace\x2d8909\x2df59582083038.mount: Deactivated successfully.
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.588 2 INFO nova.compute.manager [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Took 0.72 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.588 2 DEBUG oslo.service.loopingcall [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.589 2 DEBUG nova.compute.manager [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:43:31 np0005470441 nova_compute[192626]: 2025-10-04 05:43:31.589 2 DEBUG nova.network.neutron [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.323 2 DEBUG nova.network.neutron [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updated VIF entry in instance network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.324 2 DEBUG nova.network.neutron [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updating instance_info_cache with network_info: [{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.376 2 DEBUG oslo_concurrency.lockutils [req-bd4f1d31-625c-461f-9837-e110bc50e6bf req-5c81bf14-935b-44ac-91f6-1e7cf8ebf608 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-55e90e51-a248-4cbd-b153-f507c41f4982" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.621 2 DEBUG nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.622 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.623 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.623 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.624 2 DEBUG nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] No waiting events found dispatching network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.624 2 DEBUG nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.625 2 DEBUG nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.625 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.626 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.626 2 DEBUG oslo_concurrency.lockutils [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.626 2 DEBUG nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] No waiting events found dispatching network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.627 2 WARNING nova.compute.manager [req-715a38e5-3d8c-454b-926b-31754cca91c8 req-be0630ca-5362-4ff4-ab9c-b6e462a9f3a0 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Received unexpected event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:43:32 np0005470441 nova_compute[192626]: 2025-10-04 05:43:32.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.492 2 DEBUG nova.network.neutron [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.548 2 INFO nova.compute.manager [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Took 1.96 seconds to deallocate network for instance.#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.712 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.712 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.776 2 DEBUG nova.compute.provider_tree [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.801 2 DEBUG nova.scheduler.client.report [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:43:33 np0005470441 nova_compute[192626]: 2025-10-04 05:43:33.970 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:34 np0005470441 nova_compute[192626]: 2025-10-04 05:43:34.085 2 INFO nova.scheduler.client.report [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 55e90e51-a248-4cbd-b153-f507c41f4982#033[00m
Oct  4 01:43:34 np0005470441 nova_compute[192626]: 2025-10-04 05:43:34.407 2 DEBUG oslo_concurrency.lockutils [None req-9dcf3a72-28c4-4a36-8958-2a86958edf09 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "55e90e51-a248-4cbd-b153-f507c41f4982" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:35.510 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:36 np0005470441 nova_compute[192626]: 2025-10-04 05:43:36.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:37 np0005470441 podman[228759]: 2025-10-04 05:43:37.318038945 +0000 UTC m=+0.067613793 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350)
Oct  4 01:43:37 np0005470441 nova_compute[192626]: 2025-10-04 05:43:37.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:41 np0005470441 nova_compute[192626]: 2025-10-04 05:43:41.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:42 np0005470441 podman[228781]: 2025-10-04 05:43:42.322865141 +0000 UTC m=+0.069969899 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:43:42 np0005470441 nova_compute[192626]: 2025-10-04 05:43:42.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.884 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.884 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.909 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.981 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.981 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.989 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:43:43 np0005470441 nova_compute[192626]: 2025-10-04 05:43:43.989 2 INFO nova.compute.claims [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.114 2 DEBUG nova.compute.provider_tree [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.129 2 DEBUG nova.scheduler.client.report [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.158 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.159 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.215 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.216 2 DEBUG nova.network.neutron [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.246 2 INFO nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.266 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:43:44 np0005470441 podman[228807]: 2025-10-04 05:43:44.334384856 +0000 UTC m=+0.079967364 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.413 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.414 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.415 2 INFO nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Creating image(s)#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.415 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.416 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.416 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.428 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.484 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.485 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.486 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.509 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.569 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.570 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.604 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.605 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.605 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.661 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.662 2 DEBUG nova.virt.disk.api [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.662 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.727 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.729 2 DEBUG nova.virt.disk.api [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.730 2 DEBUG nova.objects.instance [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f036d2b-dbc1-4468-abb6-7290fd488111 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.750 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.751 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Ensure instance console log exists: /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.751 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.752 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:44 np0005470441 nova_compute[192626]: 2025-10-04 05:43:44.752 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:45 np0005470441 nova_compute[192626]: 2025-10-04 05:43:45.976 2 DEBUG nova.policy [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:43:46 np0005470441 nova_compute[192626]: 2025-10-04 05:43:46.137 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556611.136531, 55e90e51-a248-4cbd-b153-f507c41f4982 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:46 np0005470441 nova_compute[192626]: 2025-10-04 05:43:46.138 2 INFO nova.compute.manager [-] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:43:46 np0005470441 nova_compute[192626]: 2025-10-04 05:43:46.168 2 DEBUG nova.compute.manager [None req-9add0bfd-ec4b-49aa-8ff3-cd9c39888bb0 - - - - - -] [instance: 55e90e51-a248-4cbd-b153-f507c41f4982] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:46 np0005470441 nova_compute[192626]: 2025-10-04 05:43:46.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:47 np0005470441 podman[228842]: 2025-10-04 05:43:47.405528778 +0000 UTC m=+0.137849639 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:43:47 np0005470441 nova_compute[192626]: 2025-10-04 05:43:47.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.185 2 DEBUG nova.network.neutron [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Successfully updated port: a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.205 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.205 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.206 2 DEBUG nova.network.neutron [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.278 2 DEBUG nova.compute.manager [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.279 2 DEBUG nova.compute.manager [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Refreshing instance network info cache due to event network-changed-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.280 2 DEBUG oslo_concurrency.lockutils [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:43:48 np0005470441 nova_compute[192626]: 2025-10-04 05:43:48.412 2 DEBUG nova.network.neutron [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.183 2 DEBUG nova.network.neutron [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Updating instance_info_cache with network_info: [{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.224 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.225 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Instance network_info: |[{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.226 2 DEBUG oslo_concurrency.lockutils [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.226 2 DEBUG nova.network.neutron [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Refreshing network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.229 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Start _get_guest_xml network_info=[{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.234 2 WARNING nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.247 2 DEBUG nova.virt.libvirt.host [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.248 2 DEBUG nova.virt.libvirt.host [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.252 2 DEBUG nova.virt.libvirt.host [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.253 2 DEBUG nova.virt.libvirt.host [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.254 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.254 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.255 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.255 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.255 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.255 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.256 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.256 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.258 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.258 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.258 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.258 2 DEBUG nova.virt.hardware [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.261 2 DEBUG nova.virt.libvirt.vif [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-162878515',display_name='tempest-TestNetworkBasicOps-server-162878515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-162878515',id=36,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa1n289akAmK/Cl/DSlXTMq78XGJJNlOniIBoK8uODdb8oSvTAOxC8nWEBqT87LsFESDxSuIhxYgcFgRMH3aAddCDyUt4I+FQejDAD+Y9yJxA3BedEo+4A2NGVApGlN0g==',key_name='tempest-TestNetworkBasicOps-2063064839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-kdmh8sv2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:43:44Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=1f036d2b-dbc1-4468-abb6-7290fd488111,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.262 2 DEBUG nova.network.os_vif_util [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.262 2 DEBUG nova.network.os_vif_util [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.263 2 DEBUG nova.objects.instance [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f036d2b-dbc1-4468-abb6-7290fd488111 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.290 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <uuid>1f036d2b-dbc1-4468-abb6-7290fd488111</uuid>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <name>instance-00000024</name>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-162878515</nova:name>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:43:50</nova:creationTime>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        <nova:port uuid="a1a812d2-e8c0-4cd3-82b3-d1877f068ca7">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="serial">1f036d2b-dbc1-4468-abb6-7290fd488111</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="uuid">1f036d2b-dbc1-4468-abb6-7290fd488111</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.config"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:97:d2:b0"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <target dev="tapa1a812d2-e8"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/console.log" append="off"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:43:50 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:43:50 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:43:50 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:43:50 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.292 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Preparing to wait for external event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.292 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.292 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.292 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.293 2 DEBUG nova.virt.libvirt.vif [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-162878515',display_name='tempest-TestNetworkBasicOps-server-162878515',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-162878515',id=36,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa1n289akAmK/Cl/DSlXTMq78XGJJNlOniIBoK8uODdb8oSvTAOxC8nWEBqT87LsFESDxSuIhxYgcFgRMH3aAddCDyUt4I+FQejDAD+Y9yJxA3BedEo+4A2NGVApGlN0g==',key_name='tempest-TestNetworkBasicOps-2063064839',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-kdmh8sv2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:43:44Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=1f036d2b-dbc1-4468-abb6-7290fd488111,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.293 2 DEBUG nova.network.os_vif_util [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.294 2 DEBUG nova.network.os_vif_util [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.294 2 DEBUG os_vif [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.295 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.298 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1a812d2-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.299 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa1a812d2-e8, col_values=(('external_ids', {'iface-id': 'a1a812d2-e8c0-4cd3-82b3-d1877f068ca7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:97:d2:b0', 'vm-uuid': '1f036d2b-dbc1-4468-abb6-7290fd488111'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:50 np0005470441 NetworkManager[51690]: <info>  [1759556630.3021] manager: (tapa1a812d2-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.306 2 INFO os_vif [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8')#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.371 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.372 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.372 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:97:d2:b0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:43:50 np0005470441 nova_compute[192626]: 2025-10-04 05:43:50.372 2 INFO nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Using config drive#033[00m
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.606 2 INFO nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Creating config drive at /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.config#033[00m
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.610 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1e1up5tv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.748 2 DEBUG oslo_concurrency.processutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1e1up5tv" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:43:51 np0005470441 kernel: tapa1a812d2-e8: entered promiscuous mode
Oct  4 01:43:51 np0005470441 NetworkManager[51690]: <info>  [1759556631.8211] manager: (tapa1a812d2-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:51Z|00267|binding|INFO|Claiming lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for this chassis.
Oct  4 01:43:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:51Z|00268|binding|INFO|a1a812d2-e8c0-4cd3-82b3-d1877f068ca7: Claiming fa:16:3e:97:d2:b0 10.100.0.6
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.831 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d2:b0 10.100.0.6'], port_security=['fa:16:3e:97:d2:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f036d2b-dbc1-4468-abb6-7290fd488111', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d97328c8-d09e-4ace-8909-f59582083038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'af6e9f85-2a38-4738-916c-2544b03b6a65', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f07a0568-5429-4fa1-87e0-fb13bf3492a0, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.832 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 in datapath d97328c8-d09e-4ace-8909-f59582083038 bound to our chassis#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.833 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d97328c8-d09e-4ace-8909-f59582083038#033[00m
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:51Z|00269|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 ovn-installed in OVS
Oct  4 01:43:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:51Z|00270|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 up in Southbound
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:51 np0005470441 nova_compute[192626]: 2025-10-04 05:43:51.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.850 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2e098b71-0a80-4392-95aa-a2b278e0fcd5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.850 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd97328c8-d1 in ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.852 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd97328c8-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.852 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4097a195-bad9-4c5a-ad7a-8c8c964d7100]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.853 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c051591c-6765-4fa6-8e34-b0f83b3a63c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 systemd-udevd[228889]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:43:51 np0005470441 systemd-machined[152624]: New machine qemu-20-instance-00000024.
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.864 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[9c54456f-7085-4319-967e-cd963d2296fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 NetworkManager[51690]: <info>  [1759556631.8742] device (tapa1a812d2-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:43:51 np0005470441 NetworkManager[51690]: <info>  [1759556631.8749] device (tapa1a812d2-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:43:51 np0005470441 systemd[1]: Started Virtual Machine qemu-20-instance-00000024.
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.890 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f7cc586f-5ecf-4ad3-8157-61d3202b8428]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.920 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[af629a35-887f-4ecd-bf10-1cd49d28165c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.927 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[51047e45-84c5-401f-8607-e50730d09216]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 NetworkManager[51690]: <info>  [1759556631.9310] manager: (tapd97328c8-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.962 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed125e5-61cd-4ee2-a72f-7f9631fdf418]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.965 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[28caba31-46ee-41a7-a867-35196e74b9c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:51 np0005470441 NetworkManager[51690]: <info>  [1759556631.9881] device (tapd97328c8-d0): carrier: link connected
Oct  4 01:43:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:51.993 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[b29d4027-ed35-467b-afbe-929eba8b8f42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.010 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfa426a-3409-44c9-b268-b8f1f1b3aa6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd97328c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:1c:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452964, 'reachable_time': 21796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228921, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.023 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3853a127-ac40-4f55-b672-bd6c90ab8ee9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe17:1cb7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 452964, 'tstamp': 452964}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228923, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.035 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b8395c19-d0ad-444d-9003-095bd2702d4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd97328c8-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:17:1c:b7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452964, 'reachable_time': 21796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228924, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.056 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6670fe1b-2ce2-4166-b444-f3af14227cfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.105 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a023a36e-fa74-48c4-9fe3-8b2c98fb8575]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.106 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd97328c8-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.106 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.107 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd97328c8-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:52 np0005470441 NetworkManager[51690]: <info>  [1759556632.1095] manager: (tapd97328c8-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Oct  4 01:43:52 np0005470441 kernel: tapd97328c8-d0: entered promiscuous mode
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.113 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd97328c8-d0, col_values=(('external_ids', {'iface-id': 'e7fb021d-dca1-41e7-bb9e-c8037fd5783e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:52 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:52Z|00271|binding|INFO|Releasing lport e7fb021d-dca1-41e7-bb9e-c8037fd5783e from this chassis (sb_readonly=0)
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.143 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.143 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[db601654-8e0c-4a0a-aa79-8429bea0171f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.144 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-d97328c8-d09e-4ace-8909-f59582083038
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/d97328c8-d09e-4ace-8909-f59582083038.pid.haproxy
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID d97328c8-d09e-4ace-8909-f59582083038
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:43:52 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:52.144 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'env', 'PROCESS_TAG=haproxy-d97328c8-d09e-4ace-8909-f59582083038', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d97328c8-d09e-4ace-8909-f59582083038.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.313 2 DEBUG nova.network.neutron [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Updated VIF entry in instance network info cache for port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.313 2 DEBUG nova.network.neutron [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Updating instance_info_cache with network_info: [{"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.339 2 DEBUG oslo_concurrency.lockutils [req-a8f63f7a-a9ff-406c-bde1-71b821b30142 req-15b59b0e-1dc9-4fb3-9301-1c7f178f3dce 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1f036d2b-dbc1-4468-abb6-7290fd488111" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:43:52 np0005470441 podman[228963]: 2025-10-04 05:43:52.604056759 +0000 UTC m=+0.074378845 container create 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:43:52 np0005470441 systemd[1]: Started libpod-conmon-0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03.scope.
Oct  4 01:43:52 np0005470441 podman[228963]: 2025-10-04 05:43:52.571569346 +0000 UTC m=+0.041891432 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:43:52 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:43:52 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f47dbb5ae66cdd7a53b3dc0c3748ae0200259679ce3d63173c3eef9884b5274c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:43:52 np0005470441 podman[228963]: 2025-10-04 05:43:52.700061328 +0000 UTC m=+0.170383414 container init 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  4 01:43:52 np0005470441 podman[228963]: 2025-10-04 05:43:52.705865533 +0000 UTC m=+0.176187599 container start 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:43:52 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [NOTICE]   (228982) : New worker (228984) forked
Oct  4 01:43:52 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [NOTICE]   (228982) : Loading success.
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.823 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556632.8234563, 1f036d2b-dbc1-4468-abb6-7290fd488111 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.824 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] VM Started (Lifecycle Event)#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.863 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.868 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556632.823581, 1f036d2b-dbc1-4468-abb6-7290fd488111 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.868 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.889 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.891 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:43:52 np0005470441 nova_compute[192626]: 2025-10-04 05:43:52.910 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.546 2 DEBUG nova.compute.manager [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.547 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.548 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.548 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.549 2 DEBUG nova.compute.manager [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Processing event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.549 2 DEBUG nova.compute.manager [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.550 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.550 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.551 2 DEBUG oslo_concurrency.lockutils [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.551 2 DEBUG nova.compute.manager [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] No waiting events found dispatching network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.552 2 WARNING nova.compute.manager [req-49587230-5de4-4916-b893-b2e5924126dd req-b39120a0-de07-4d88-95ce-062c73a99b26 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received unexpected event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with vm_state building and task_state spawning.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.553 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.561 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556633.5609627, 1f036d2b-dbc1-4468-abb6-7290fd488111 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.561 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.563 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.566 2 INFO nova.virt.libvirt.driver [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Instance spawned successfully.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.567 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.588 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.592 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.595 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.596 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.596 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.597 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.597 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.598 2 DEBUG nova.virt.libvirt.driver [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.636 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.669 2 INFO nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Took 9.25 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.669 2 DEBUG nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.796 2 INFO nova.compute.manager [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Took 9.84 seconds to build instance.#033[00m
Oct  4 01:43:53 np0005470441 nova_compute[192626]: 2025-10-04 05:43:53.815 2 DEBUG oslo_concurrency.lockutils [None req-c2fbd3f5-6c96-4329-bdba-d2f2f2f53990 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.931s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:55Z|00272|binding|INFO|Releasing lport e7fb021d-dca1-41e7-bb9e-c8037fd5783e from this chassis (sb_readonly=0)
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.713 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.714 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.714 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.715 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.715 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.716 2 INFO nova.compute.manager [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Terminating instance#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.717 2 DEBUG nova.compute.manager [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:43:55 np0005470441 kernel: tapa1a812d2-e8 (unregistering): left promiscuous mode
Oct  4 01:43:55 np0005470441 NetworkManager[51690]: <info>  [1759556635.7404] device (tapa1a812d2-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:55Z|00273|binding|INFO|Releasing lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 from this chassis (sb_readonly=0)
Oct  4 01:43:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:55Z|00274|binding|INFO|Setting lport a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 down in Southbound
Oct  4 01:43:55 np0005470441 ovn_controller[94840]: 2025-10-04T05:43:55Z|00275|binding|INFO|Removing iface tapa1a812d2-e8 ovn-installed in OVS
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.763 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:97:d2:b0 10.100.0.6'], port_security=['fa:16:3e:97:d2:b0 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1f036d2b-dbc1-4468-abb6-7290fd488111', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d97328c8-d09e-4ace-8909-f59582083038', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1173126445', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'af6e9f85-2a38-4738-916c-2544b03b6a65', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.205', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f07a0568-5429-4fa1-87e0-fb13bf3492a0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.764 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 in datapath d97328c8-d09e-4ace-8909-f59582083038 unbound from our chassis#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.765 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d97328c8-d09e-4ace-8909-f59582083038, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.766 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[764b6493-bbce-4507-bb49-a84b1ef99145]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.766 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 namespace which is not needed anymore#033[00m
Oct  4 01:43:55 np0005470441 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000024.scope: Deactivated successfully.
Oct  4 01:43:55 np0005470441 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000024.scope: Consumed 3.050s CPU time.
Oct  4 01:43:55 np0005470441 systemd-machined[152624]: Machine qemu-20-instance-00000024 terminated.
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 podman[228993]: 2025-10-04 05:43:55.825133793 +0000 UTC m=+0.066936393 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  4 01:43:55 np0005470441 podman[228996]: 2025-10-04 05:43:55.844316938 +0000 UTC m=+0.082052083 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:43:55 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [NOTICE]   (228982) : haproxy version is 2.8.14-c23fe91
Oct  4 01:43:55 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [NOTICE]   (228982) : path to executable is /usr/sbin/haproxy
Oct  4 01:43:55 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [WARNING]  (228982) : Exiting Master process...
Oct  4 01:43:55 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [ALERT]    (228982) : Current worker (228984) exited with code 143 (Terminated)
Oct  4 01:43:55 np0005470441 neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038[228978]: [WARNING]  (228982) : All workers exited. Exiting... (0)
Oct  4 01:43:55 np0005470441 systemd[1]: libpod-0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03.scope: Deactivated successfully.
Oct  4 01:43:55 np0005470441 podman[229060]: 2025-10-04 05:43:55.891982743 +0000 UTC m=+0.041022217 container died 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:43:55 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03-userdata-shm.mount: Deactivated successfully.
Oct  4 01:43:55 np0005470441 systemd[1]: var-lib-containers-storage-overlay-f47dbb5ae66cdd7a53b3dc0c3748ae0200259679ce3d63173c3eef9884b5274c-merged.mount: Deactivated successfully.
Oct  4 01:43:55 np0005470441 podman[229060]: 2025-10-04 05:43:55.923624953 +0000 UTC m=+0.072664427 container cleanup 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:43:55 np0005470441 systemd[1]: libpod-conmon-0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03.scope: Deactivated successfully.
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.969 2 INFO nova.virt.libvirt.driver [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Instance destroyed successfully.#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.969 2 DEBUG nova.objects.instance [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 1f036d2b-dbc1-4468-abb6-7290fd488111 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:43:55 np0005470441 podman[229093]: 2025-10-04 05:43:55.983815543 +0000 UTC m=+0.038408142 container remove 0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.988 2 DEBUG nova.virt.libvirt.vif [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-162878515',display_name='tempest-TestNetworkBasicOps-server-162878515',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-162878515',id=36,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJa1n289akAmK/Cl/DSlXTMq78XGJJNlOniIBoK8uODdb8oSvTAOxC8nWEBqT87LsFESDxSuIhxYgcFgRMH3aAddCDyUt4I+FQejDAD+Y9yJxA3BedEo+4A2NGVApGlN0g==',key_name='tempest-TestNetworkBasicOps-2063064839',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:43:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-kdmh8sv2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:43:53Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=1f036d2b-dbc1-4468-abb6-7290fd488111,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.988 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c2362500-bad3-4779-8ce3-815e69526bbf]: (4, ('Sat Oct  4 05:43:55 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 (0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03)\n0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03\nSat Oct  4 05:43:55 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 (0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03)\n0f687e47c692308aa7cdd66c3aa0d1e83bc0bf64ec6dd67b5d11e2e8957f9f03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.989 2 DEBUG nova.network.os_vif_util [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "address": "fa:16:3e:97:d2:b0", "network": {"id": "d97328c8-d09e-4ace-8909-f59582083038", "bridge": "br-int", "label": "tempest-network-smoke--40461002", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1a812d2-e8", "ovs_interfaceid": "a1a812d2-e8c0-4cd3-82b3-d1877f068ca7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.990 2 DEBUG nova.network.os_vif_util [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.990 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b847b399-265b-4eb4-966a-246521d9a5ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.990 2 DEBUG os_vif [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:43:55 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:55.990 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd97328c8-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.991 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1a812d2-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:43:55 np0005470441 kernel: tapd97328c8-d0: left promiscuous mode
Oct  4 01:43:55 np0005470441 nova_compute[192626]: 2025-10-04 05:43:55.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.008 2 INFO os_vif [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:97:d2:b0,bridge_name='br-int',has_traffic_filtering=True,id=a1a812d2-e8c0-4cd3-82b3-d1877f068ca7,network=Network(d97328c8-d09e-4ace-8909-f59582083038),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa1a812d2-e8')#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.009 2 INFO nova.virt.libvirt.driver [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Deleting instance files /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111_del#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.009 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8c4a0cbd-7f1a-4ac5-922a-4d121e8d88f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.010 2 INFO nova.virt.libvirt.driver [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Deletion of /var/lib/nova/instances/1f036d2b-dbc1-4468-abb6-7290fd488111_del complete#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.033 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[520e984d-089b-455b-b9ef-ef814a0aabf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.034 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ba903ccf-69c1-4d62-b111-836582cde67b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.042 2 DEBUG nova.compute.manager [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.042 2 DEBUG oslo_concurrency.lockutils [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.043 2 DEBUG oslo_concurrency.lockutils [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.043 2 DEBUG oslo_concurrency.lockutils [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.043 2 DEBUG nova.compute.manager [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] No waiting events found dispatching network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.044 2 DEBUG nova.compute.manager [req-84df09d8-1ef4-4a3e-b6d8-2c6c56774d59 req-0da7fa96-89d6-4c21-b9d2-3d5fe9e99b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-vif-unplugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.046 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f2988fcb-a0c5-425a-b428-b71144929c07]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 452956, 'reachable_time': 30548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229122, 'error': None, 'target': 'ovnmeta-d97328c8-d09e-4ace-8909-f59582083038', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.047 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d97328c8-d09e-4ace-8909-f59582083038 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:43:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:43:56.048 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[95a3dc45-a0fd-41fa-be60-4e08168fc43f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:43:56 np0005470441 systemd[1]: run-netns-ovnmeta\x2dd97328c8\x2dd09e\x2d4ace\x2d8909\x2df59582083038.mount: Deactivated successfully.
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.085 2 INFO nova.compute.manager [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.086 2 DEBUG oslo.service.loopingcall [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.086 2 DEBUG nova.compute.manager [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:43:56 np0005470441 nova_compute[192626]: 2025-10-04 05:43:56.086 2 DEBUG nova.network.neutron [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.280 2 DEBUG nova.network.neutron [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.303 2 INFO nova.compute.manager [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Took 1.22 seconds to deallocate network for instance.#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.361 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.362 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.434 2 DEBUG nova.compute.provider_tree [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.454 2 DEBUG nova.scheduler.client.report [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.477 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.499 2 INFO nova.scheduler.client.report [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 1f036d2b-dbc1-4468-abb6-7290fd488111#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.575 2 DEBUG oslo_concurrency.lockutils [None req-f4a0d025-7885-473d-b100-ac2b15b7010f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:57 np0005470441 nova_compute[192626]: 2025-10-04 05:43:57.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.217 2 DEBUG nova.compute.manager [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.218 2 DEBUG oslo_concurrency.lockutils [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.218 2 DEBUG oslo_concurrency.lockutils [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.219 2 DEBUG oslo_concurrency.lockutils [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1f036d2b-dbc1-4468-abb6-7290fd488111-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.219 2 DEBUG nova.compute.manager [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] No waiting events found dispatching network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:43:58 np0005470441 nova_compute[192626]: 2025-10-04 05:43:58.219 2 WARNING nova.compute.manager [req-5eb273c2-c740-4493-8889-becae67bc9f2 req-37108f9c-1e79-4452-b17c-6c9861a044fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Received unexpected event network-vif-plugged-a1a812d2-e8c0-4cd3-82b3-d1877f068ca7 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:44:00 np0005470441 podman[229124]: 2025-10-04 05:44:00.32306245 +0000 UTC m=+0.067004196 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:44:00 np0005470441 podman[229123]: 2025-10-04 05:44:00.346230887 +0000 UTC m=+0.095917666 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:44:00 np0005470441 nova_compute[192626]: 2025-10-04 05:44:00.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.710 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:44:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:44:02 np0005470441 nova_compute[192626]: 2025-10-04 05:44:02.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:02 np0005470441 nova_compute[192626]: 2025-10-04 05:44:02.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:03 np0005470441 nova_compute[192626]: 2025-10-04 05:44:03.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:04 np0005470441 nova_compute[192626]: 2025-10-04 05:44:04.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:04 np0005470441 nova_compute[192626]: 2025-10-04 05:44:04.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:04 np0005470441 nova_compute[192626]: 2025-10-04 05:44:04.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:44:05 np0005470441 nova_compute[192626]: 2025-10-04 05:44:05.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:06.752 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:06.753 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:06.753 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:06 np0005470441 nova_compute[192626]: 2025-10-04 05:44:06.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.737 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.737 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.764 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.765 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.765 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.765 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:07 np0005470441 podman[229165]: 2025-10-04 05:44:07.880082035 +0000 UTC m=+0.068259251 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.970 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.971 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5751MB free_disk=73.42867660522461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.972 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:07 np0005470441 nova_compute[192626]: 2025-10-04 05:44:07.972 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.040 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.041 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.073 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.095 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.124 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:44:08 np0005470441 nova_compute[192626]: 2025-10-04 05:44:08.125 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:09 np0005470441 nova_compute[192626]: 2025-10-04 05:44:09.103 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:10 np0005470441 nova_compute[192626]: 2025-10-04 05:44:10.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:10 np0005470441 nova_compute[192626]: 2025-10-04 05:44:10.968 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556635.9674091, 1f036d2b-dbc1-4468-abb6-7290fd488111 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:10 np0005470441 nova_compute[192626]: 2025-10-04 05:44:10.969 2 INFO nova.compute.manager [-] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:44:10 np0005470441 nova_compute[192626]: 2025-10-04 05:44:10.989 2 DEBUG nova.compute.manager [None req-5e701bf5-abb6-4d94-83a1-531ec6fefe6f - - - - - -] [instance: 1f036d2b-dbc1-4468-abb6-7290fd488111] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:11 np0005470441 nova_compute[192626]: 2025-10-04 05:44:11.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:12 np0005470441 nova_compute[192626]: 2025-10-04 05:44:12.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:13 np0005470441 podman[229186]: 2025-10-04 05:44:13.316177559 +0000 UTC m=+0.057502105 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:44:14 np0005470441 nova_compute[192626]: 2025-10-04 05:44:14.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:15 np0005470441 podman[229210]: 2025-10-04 05:44:15.326148809 +0000 UTC m=+0.069528437 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  4 01:44:16 np0005470441 nova_compute[192626]: 2025-10-04 05:44:16.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:16 np0005470441 nova_compute[192626]: 2025-10-04 05:44:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.396 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.396 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.416 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.497 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.498 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.505 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.505 2 INFO nova.compute.claims [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.620 2 DEBUG nova.compute.provider_tree [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.640 2 DEBUG nova.scheduler.client.report [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.686 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.687 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.757 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.758 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.794 2 INFO nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.818 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.947 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.949 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.950 2 INFO nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Creating image(s)#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.952 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.953 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.954 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:17 np0005470441 nova_compute[192626]: 2025-10-04 05:44:17.980 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.059 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.060 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.061 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.071 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.131 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.132 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.174 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.176 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
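The acquire/release pair above holds a lock keyed on the base image's content hash (`cd383f87…`) around `create_qcow2_image`, so two concurrent boots from the same Glance image cannot race on the shared `_base` file. A bare-bones interprocess sketch with `fcntl.flock` — `external_lock` and the lock path are hypothetical, and oslo.concurrency's lockutils does considerably more:

```python
import fcntl
import os
from contextlib import contextmanager

@contextmanager
def external_lock(path):
    """Exclusive advisory file lock: a minimal stand-in for
    oslo_concurrency.lockutils external locks (sketch only)."""
    fd = os.open(path, os.O_CREAT | os.O_RDWR, 0o644)
    try:
        fcntl.flock(fd, fcntl.LOCK_EX)   # blocks until acquired
        yield
    finally:
        fcntl.flock(fd, fcntl.LOCK_UN)
        os.close(fd)

lock_path = "/tmp/nova-base-image.lock"  # hypothetical path
with external_lock(lock_path):
    # While held, a second non-blocking attempt on the same file fails.
    fd2 = os.open(lock_path, os.O_RDWR)
    try:
        fcntl.flock(fd2, fcntl.LOCK_EX | fcntl.LOCK_NB)
        contended = False
    except BlockingIOError:
        contended = True
    finally:
        os.close(fd2)
```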
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.176 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.237 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.238 2 DEBUG nova.virt.disk.api [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Checking if we can resize image /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.239 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.327 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.328 2 DEBUG nova.virt.disk.api [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Cannot resize image /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
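The "Cannot resize image … to a smaller size" line above is the grow-only guard firing: the flavor asks for a 1 GiB root disk and the freshly created overlay is already 1 GiB, so nova skips the resize. The check amounts to comparing the requested size against `virtual-size` from `qemu-img info --output=json`; a sketch, as an assumed simplification of the logic in `nova/virt/disk/api.py`:

```python
import json

def should_resize(qemu_img_info_json, requested_bytes):
    """Grow-only guard: resize only when the request is strictly
    larger than the image's current virtual size (sketch, not the
    exact nova code path)."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes > virtual_size

# Output shape as produced by `qemu-img info --output=json` (trimmed).
info = '{"virtual-size": 1073741824, "format": "qcow2"}'
```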
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.328 2 DEBUG nova.objects.instance [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'migration_context' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.345 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.345 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Ensure instance console log exists: /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.346 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.346 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.346 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:18 np0005470441 podman[229240]: 2025-10-04 05:44:18.390336273 +0000 UTC m=+0.137328285 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:44:18 np0005470441 nova_compute[192626]: 2025-10-04 05:44:18.453 2 DEBUG nova.policy [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd65c768451494a3f9e4f9a238fa5c40d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0c087ea0f62444e80490916b42c760f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:44:19 np0005470441 nova_compute[192626]: 2025-10-04 05:44:19.865 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Successfully created port: 7a355384-4990-4aec-b3c5-fc480e9abef3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.321 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Successfully updated port: 7a355384-4990-4aec-b3c5-fc480e9abef3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.340 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.341 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquired lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.341 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.432 2 DEBUG nova.compute.manager [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.433 2 DEBUG nova.compute.manager [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing instance network info cache due to event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.433 2 DEBUG oslo_concurrency.lockutils [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:44:21 np0005470441 nova_compute[192626]: 2025-10-04 05:44:21.631 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.506 2 DEBUG nova.network.neutron [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.534 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Releasing lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.534 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance network_info: |[{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.535 2 DEBUG oslo_concurrency.lockutils [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.535 2 DEBUG nova.network.neutron [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.538 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Start _get_guest_xml network_info=[{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.543 2 WARNING nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.548 2 DEBUG nova.virt.libvirt.host [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.548 2 DEBUG nova.virt.libvirt.host [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.553 2 DEBUG nova.virt.libvirt.host [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.554 2 DEBUG nova.virt.libvirt.host [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.555 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.555 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.555 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.556 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.556 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.556 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.556 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.556 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.557 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.557 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.557 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.557 2 DEBUG nova.virt.hardware [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
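With no flavor or image topology constraints (all preferences and limits above are 0, meaning "any"), nova enumerates every sockets/cores/threads factorization of the vCPU count; for this 1-vCPU flavor the only candidate is 1:1:1. A sketch of that enumeration, as an assumed simplification of `nova.virt.hardware._get_possible_cpu_topologies`:

```python
from itertools import product

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Every (sockets, cores, threads) triple whose product equals
    the vCPU count, within the given limits (sketch, not the exact
    nova code)."""
    return [(s, c, t)
            for s, c, t in product(range(1, min(vcpus, max_sockets) + 1),
                                   range(1, min(vcpus, max_cores) + 1),
                                   range(1, min(vcpus, max_threads) + 1))
            if s * c * t == vcpus]
```

For 1 vCPU this yields only `(1, 1, 1)`, matching the log; for larger counts it produces all factorizations, e.g. 4 vCPUs admits `(4, 1, 1)`, `(2, 2, 1)`, `(1, 2, 2)`, and so on.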
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.560 2 DEBUG nova.virt.libvirt.vif [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:44:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1596820660',display_name='tempest-TestNetworkAdvancedServerOps-server-1596820660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1596820660',id=37,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC4wQ/X6/0SAB6Rc5KlTH9f3pyPVWOiQC4NvXftXMCtQ+LWE8QAb+Sd0w1/aKMwLu1E1D7bdBhUPrAOFVwtwvhneZiVX+1Wmdo4fjjO2BAL4vIvChQgt0wvW5z5SoXjMyQ==',key_name='tempest-TestNetworkAdvancedServerOps-1579178700',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-9ix08ofh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:44:17Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=236c13b8-294d-472b-81b3-f3c6635a12ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.560 2 DEBUG nova.network.os_vif_util [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.561 2 DEBUG nova.network.os_vif_util [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.562 2 DEBUG nova.objects.instance [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.578 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <uuid>236c13b8-294d-472b-81b3-f3c6635a12ac</uuid>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <name>instance-00000025</name>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1596820660</nova:name>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:44:22</nova:creationTime>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:user uuid="d65c768451494a3f9e4f9a238fa5c40d">tempest-TestNetworkAdvancedServerOps-1635331179-project-member</nova:user>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:project uuid="d0c087ea0f62444e80490916b42c760f">tempest-TestNetworkAdvancedServerOps-1635331179</nova:project>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        <nova:port uuid="7a355384-4990-4aec-b3c5-fc480e9abef3">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="serial">236c13b8-294d-472b-81b3-f3c6635a12ac</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="uuid">236c13b8-294d-472b-81b3-f3c6635a12ac</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.config"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:26:71:6d"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <target dev="tap7a355384-49"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/console.log" append="off"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:44:22 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:44:22 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:44:22 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:44:22 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.579 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Preparing to wait for external event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.579 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.580 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.580 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.580 2 DEBUG nova.virt.libvirt.vif [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:44:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1596820660',display_name='tempest-TestNetworkAdvancedServerOps-server-1596820660',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1596820660',id=37,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC4wQ/X6/0SAB6Rc5KlTH9f3pyPVWOiQC4NvXftXMCtQ+LWE8QAb+Sd0w1/aKMwLu1E1D7bdBhUPrAOFVwtwvhneZiVX+1Wmdo4fjjO2BAL4vIvChQgt0wvW5z5SoXjMyQ==',key_name='tempest-TestNetworkAdvancedServerOps-1579178700',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-9ix08ofh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:44:17Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=236c13b8-294d-472b-81b3-f3c6635a12ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.581 2 DEBUG nova.network.os_vif_util [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.581 2 DEBUG nova.network.os_vif_util [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.582 2 DEBUG os_vif [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.582 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.583 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a355384-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.586 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a355384-49, col_values=(('external_ids', {'iface-id': '7a355384-4990-4aec-b3c5-fc480e9abef3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:71:6d', 'vm-uuid': '236c13b8-294d-472b-81b3-f3c6635a12ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:22 np0005470441 NetworkManager[51690]: <info>  [1759556662.5893] manager: (tap7a355384-49): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.594 2 INFO os_vif [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49')#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.647 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.647 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.647 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] No VIF found with MAC fa:16:3e:26:71:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.648 2 INFO nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Using config drive#033[00m
Oct  4 01:44:22 np0005470441 nova_compute[192626]: 2025-10-04 05:44:22.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.111 2 INFO nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Creating config drive at /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.config#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.117 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpetpvt0sz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.247 2 DEBUG oslo_concurrency.processutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpetpvt0sz" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:44:23 np0005470441 kernel: tap7a355384-49: entered promiscuous mode
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.3087] manager: (tap7a355384-49): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Oct  4 01:44:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:23Z|00276|binding|INFO|Claiming lport 7a355384-4990-4aec-b3c5-fc480e9abef3 for this chassis.
Oct  4 01:44:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:23Z|00277|binding|INFO|7a355384-4990-4aec-b3c5-fc480e9abef3: Claiming fa:16:3e:26:71:6d 10.100.0.5
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.338 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:71:6d 10.100.0.5'], port_security=['fa:16:3e:26:71:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '236c13b8-294d-472b-81b3-f3c6635a12ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0dc79b17-6ee0-463e-830b-813165dac34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=472c7ec5-1ea1-42f8-998b-9c7bafb7774e, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7a355384-4990-4aec-b3c5-fc480e9abef3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.340 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7a355384-4990-4aec-b3c5-fc480e9abef3 in datapath 090b373d-8a67-47ea-adb2-2b53df5e63e1 bound to our chassis#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.343 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 090b373d-8a67-47ea-adb2-2b53df5e63e1#033[00m
Oct  4 01:44:23 np0005470441 systemd-udevd[229290]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:44:23 np0005470441 systemd-machined[152624]: New machine qemu-21-instance-00000025.
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.361 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1dd47f-7c81-4703-890d-3b21dc65c2e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.362 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap090b373d-81 in ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.3687] device (tap7a355384-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.368 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap090b373d-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.368 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[93cdbc7b-65cf-4806-a521-f02b6a6bc4eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.3703] device (tap7a355384-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.369 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7999a140-d783-4bd4-935e-baffa2a11f7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.388 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[4ddec462-2a1c-4545-ac2b-836a62b25eba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:23Z|00278|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 ovn-installed in OVS
Oct  4 01:44:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:23Z|00279|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 up in Southbound
Oct  4 01:44:23 np0005470441 systemd[1]: Started Virtual Machine qemu-21-instance-00000025.
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.406 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[637ade85-203a-4a3e-9158-e0096701a388]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.433 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[58e422d1-9e86-4f9d-a68c-8b7419625c60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.439 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aca7e6dc-140c-4650-a800-7a19b21785ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 systemd-udevd[229293]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.4404] manager: (tap090b373d-80): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.475 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8b90e6-f80c-483c-811c-49b80edb693b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.479 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a734b8b8-1fc8-4dce-9a62-71861facb407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.5002] device (tap090b373d-80): carrier: link connected
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.506 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[535fee94-a771-406d-8fd2-4cf6a2b9b1fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.521 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ffa655ca-8b05-43d1-bc92-761c24da0eb7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap090b373d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:c9:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456115, 'reachable_time': 44778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229323, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.539 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[38a8755a-0dd0-4bc6-ba4b-a69400c0b56a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:c969'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 456115, 'tstamp': 456115}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229325, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.557 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[65acfd5d-875a-4547-9755-baf66f8d035e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap090b373d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:c9:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456115, 'reachable_time': 44778, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229330, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.586 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c598e52c-8109-42a6-a6b6-b9d08e654409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.641 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[051e938c-cdcc-4f63-9980-b9b076d3fd41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.642 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090b373d-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.642 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.643 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap090b373d-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 kernel: tap090b373d-80: entered promiscuous mode
Oct  4 01:44:23 np0005470441 NetworkManager[51690]: <info>  [1759556663.6452] manager: (tap090b373d-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.647 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap090b373d-80, col_values=(('external_ids', {'iface-id': '66e99394-913f-4e61-b79f-68494922e9dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:23Z|00280|binding|INFO|Releasing lport 66e99394-913f-4e61-b79f-68494922e9dc from this chassis (sb_readonly=0)
Oct  4 01:44:23 np0005470441 nova_compute[192626]: 2025-10-04 05:44:23.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.659 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.660 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[acf844d1-bf30-4cd5-9c55-1b0266b3a02e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.661 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-090b373d-8a67-47ea-adb2-2b53df5e63e1
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 090b373d-8a67-47ea-adb2-2b53df5e63e1
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:44:23 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:23.661 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'env', 'PROCESS_TAG=haproxy-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/090b373d-8a67-47ea-adb2-2b53df5e63e1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.017 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556664.0164487, 236c13b8-294d-472b-81b3-f3c6635a12ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.018 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Started (Lifecycle Event)#033[00m
Oct  4 01:44:24 np0005470441 podman[229364]: 2025-10-04 05:44:24.044986298 +0000 UTC m=+0.063130265 container create 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  4 01:44:24 np0005470441 systemd[1]: Started libpod-conmon-0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122.scope.
Oct  4 01:44:24 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:44:24 np0005470441 podman[229364]: 2025-10-04 05:44:24.009202801 +0000 UTC m=+0.027346808 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:44:24 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6273f02fc78493031a77722c8e3b09d92053d7a533d6a80c198e9ea9f6c03b2c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:44:24 np0005470441 podman[229364]: 2025-10-04 05:44:24.126245488 +0000 UTC m=+0.144389475 container init 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:44:24 np0005470441 podman[229364]: 2025-10-04 05:44:24.138077834 +0000 UTC m=+0.156221811 container start 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:44:24 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [NOTICE]   (229383) : New worker (229385) forked
Oct  4 01:44:24 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [NOTICE]   (229383) : Loading success.
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.507 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.511 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556664.0176516, 236c13b8-294d-472b-81b3-f3c6635a12ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.511 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.541 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.544 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.562 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.897 2 DEBUG nova.network.neutron [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updated VIF entry in instance network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.898 2 DEBUG nova.network.neutron [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:44:24 np0005470441 nova_compute[192626]: 2025-10-04 05:44:24.916 2 DEBUG oslo_concurrency.lockutils [req-3ca9dc87-f0de-46e7-86d8-517d523a86ca req-dd78c5f1-40ed-4963-b7fe-f72e2434caa8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:44:26 np0005470441 podman[229394]: 2025-10-04 05:44:26.313399644 +0000 UTC m=+0.065171444 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct  4 01:44:26 np0005470441 podman[229395]: 2025-10-04 05:44:26.329009687 +0000 UTC m=+0.071226575 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.757 2 DEBUG nova.compute.manager [req-ba51a322-cdfb-415f-b269-0505fd1c7ef8 req-fc72fe0b-2280-449a-82db-e7fd51b93bfc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.757 2 DEBUG oslo_concurrency.lockutils [req-ba51a322-cdfb-415f-b269-0505fd1c7ef8 req-fc72fe0b-2280-449a-82db-e7fd51b93bfc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.757 2 DEBUG oslo_concurrency.lockutils [req-ba51a322-cdfb-415f-b269-0505fd1c7ef8 req-fc72fe0b-2280-449a-82db-e7fd51b93bfc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.758 2 DEBUG oslo_concurrency.lockutils [req-ba51a322-cdfb-415f-b269-0505fd1c7ef8 req-fc72fe0b-2280-449a-82db-e7fd51b93bfc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.758 2 DEBUG nova.compute.manager [req-ba51a322-cdfb-415f-b269-0505fd1c7ef8 req-fc72fe0b-2280-449a-82db-e7fd51b93bfc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Processing event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.759 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.763 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556666.7629304, 236c13b8-294d-472b-81b3-f3c6635a12ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.764 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.768 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.772 2 INFO nova.virt.libvirt.driver [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance spawned successfully.#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.773 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.797 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.803 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.805 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.806 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.806 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.806 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.807 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.807 2 DEBUG nova.virt.libvirt.driver [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.842 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.898 2 INFO nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Took 8.95 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:44:26 np0005470441 nova_compute[192626]: 2025-10-04 05:44:26.898 2 DEBUG nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:27 np0005470441 nova_compute[192626]: 2025-10-04 05:44:27.003 2 INFO nova.compute.manager [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Took 9.53 seconds to build instance.#033[00m
Oct  4 01:44:27 np0005470441 nova_compute[192626]: 2025-10-04 05:44:27.030 2 DEBUG oslo_concurrency.lockutils [None req-e8863c23-12a3-453f-8b03-a2b9a6de4f91 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:27 np0005470441 nova_compute[192626]: 2025-10-04 05:44:27.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:27 np0005470441 nova_compute[192626]: 2025-10-04 05:44:27.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.868 2 DEBUG nova.compute.manager [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.869 2 DEBUG oslo_concurrency.lockutils [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.869 2 DEBUG oslo_concurrency.lockutils [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.870 2 DEBUG oslo_concurrency.lockutils [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.870 2 DEBUG nova.compute.manager [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:44:28 np0005470441 nova_compute[192626]: 2025-10-04 05:44:28.870 2 WARNING nova.compute.manager [req-990f27df-c3de-4eb8-90a3-0025e1f717ae req-a1ac8774-c28e-44db-8828-e4fa7e2151fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:44:30 np0005470441 NetworkManager[51690]: <info>  [1759556670.3723] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Oct  4 01:44:30 np0005470441 NetworkManager[51690]: <info>  [1759556670.3743] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Oct  4 01:44:30 np0005470441 nova_compute[192626]: 2025-10-04 05:44:30.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:30 np0005470441 nova_compute[192626]: 2025-10-04 05:44:30.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:30Z|00281|binding|INFO|Releasing lport 66e99394-913f-4e61-b79f-68494922e9dc from this chassis (sb_readonly=0)
Oct  4 01:44:30 np0005470441 nova_compute[192626]: 2025-10-04 05:44:30.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:31 np0005470441 podman[229439]: 2025-10-04 05:44:31.318184567 +0000 UTC m=+0.066781429 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:44:31 np0005470441 podman[229440]: 2025-10-04 05:44:31.332832154 +0000 UTC m=+0.075741604 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:31.336 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:44:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:31.338 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.424 2 DEBUG nova.compute.manager [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.424 2 DEBUG nova.compute.manager [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing instance network info cache due to event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.425 2 DEBUG oslo_concurrency.lockutils [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.425 2 DEBUG oslo_concurrency.lockutils [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:44:31 np0005470441 nova_compute[192626]: 2025-10-04 05:44:31.425 2 DEBUG nova.network.neutron [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:44:32 np0005470441 nova_compute[192626]: 2025-10-04 05:44:32.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:32 np0005470441 nova_compute[192626]: 2025-10-04 05:44:32.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:34 np0005470441 nova_compute[192626]: 2025-10-04 05:44:34.994 2 DEBUG nova.network.neutron [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updated VIF entry in instance network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:44:34 np0005470441 nova_compute[192626]: 2025-10-04 05:44:34.995 2 DEBUG nova.network.neutron [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:44:35 np0005470441 nova_compute[192626]: 2025-10-04 05:44:35.022 2 DEBUG oslo_concurrency.lockutils [req-e4418db2-7274-40a4-956e-a41ee5877c29 req-6ccd975e-3fbf-4569-a0f9-28722e31718a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:44:37 np0005470441 nova_compute[192626]: 2025-10-04 05:44:37.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:37 np0005470441 nova_compute[192626]: 2025-10-04 05:44:37.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:38 np0005470441 podman[229483]: 2025-10-04 05:44:38.340910698 +0000 UTC m=+0.084770470 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  4 01:44:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:39.341 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:40Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:71:6d 10.100.0.5
Oct  4 01:44:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:40Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:71:6d 10.100.0.5
Oct  4 01:44:42 np0005470441 nova_compute[192626]: 2025-10-04 05:44:42.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:42 np0005470441 nova_compute[192626]: 2025-10-04 05:44:42.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:44 np0005470441 podman[229512]: 2025-10-04 05:44:44.322034191 +0000 UTC m=+0.076945158 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:44:46 np0005470441 podman[229538]: 2025-10-04 05:44:46.325472506 +0000 UTC m=+0.070929947 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.379 2 INFO nova.compute.manager [None req-debe88c4-d471-4d5f-af13-b3f48bf37eda d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Get console output#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.386 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.708 2 DEBUG nova.objects.instance [None req-3e6d54bb-6e49-411b-9ac8-9892f8e58a42 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.733 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556686.7335784, 236c13b8-294d-472b-81b3-f3c6635a12ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.734 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.758 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.763 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:44:46 np0005470441 nova_compute[192626]: 2025-10-04 05:44:46.794 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  4 01:44:47 np0005470441 kernel: tap7a355384-49 (unregistering): left promiscuous mode
Oct  4 01:44:47 np0005470441 NetworkManager[51690]: <info>  [1759556687.4674] device (tap7a355384-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:44:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:47Z|00282|binding|INFO|Releasing lport 7a355384-4990-4aec-b3c5-fc480e9abef3 from this chassis (sb_readonly=0)
Oct  4 01:44:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:47Z|00283|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 down in Southbound
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:47Z|00284|binding|INFO|Removing iface tap7a355384-49 ovn-installed in OVS
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.487 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:71:6d 10.100.0.5'], port_security=['fa:16:3e:26:71:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '236c13b8-294d-472b-81b3-f3c6635a12ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0dc79b17-6ee0-463e-830b-813165dac34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=472c7ec5-1ea1-42f8-998b-9c7bafb7774e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7a355384-4990-4aec-b3c5-fc480e9abef3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.488 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7a355384-4990-4aec-b3c5-fc480e9abef3 in datapath 090b373d-8a67-47ea-adb2-2b53df5e63e1 unbound from our chassis#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.489 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 090b373d-8a67-47ea-adb2-2b53df5e63e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.490 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2e674f25-a150-4feb-95dc-07a90c6a9a0a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.491 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 namespace which is not needed anymore#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct  4 01:44:47 np0005470441 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d00000025.scope: Consumed 13.206s CPU time.
Oct  4 01:44:47 np0005470441 systemd-machined[152624]: Machine qemu-21-instance-00000025 terminated.
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [NOTICE]   (229383) : haproxy version is 2.8.14-c23fe91
Oct  4 01:44:47 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [NOTICE]   (229383) : path to executable is /usr/sbin/haproxy
Oct  4 01:44:47 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [WARNING]  (229383) : Exiting Master process...
Oct  4 01:44:47 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [ALERT]    (229383) : Current worker (229385) exited with code 143 (Terminated)
Oct  4 01:44:47 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229379]: [WARNING]  (229383) : All workers exited. Exiting... (0)
Oct  4 01:44:47 np0005470441 systemd[1]: libpod-0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122.scope: Deactivated successfully.
Oct  4 01:44:47 np0005470441 podman[229586]: 2025-10-04 05:44:47.618376494 +0000 UTC m=+0.044537967 container died 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:44:47 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122-userdata-shm.mount: Deactivated successfully.
Oct  4 01:44:47 np0005470441 systemd[1]: var-lib-containers-storage-overlay-6273f02fc78493031a77722c8e3b09d92053d7a533d6a80c198e9ea9f6c03b2c-merged.mount: Deactivated successfully.
Oct  4 01:44:47 np0005470441 podman[229586]: 2025-10-04 05:44:47.655075367 +0000 UTC m=+0.081236850 container cleanup 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.658 2 DEBUG nova.compute.manager [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-unplugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.659 2 DEBUG oslo_concurrency.lockutils [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.659 2 DEBUG oslo_concurrency.lockutils [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.659 2 DEBUG oslo_concurrency.lockutils [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.659 2 DEBUG nova.compute.manager [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-unplugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.660 2 WARNING nova.compute.manager [req-c9a24d7b-2039-41b9-b021-5f0874b509b7 req-ad72e9a0-97e2-4e7e-8aec-784bb8d667db 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-unplugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state active and task_state suspending.#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 systemd[1]: libpod-conmon-0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122.scope: Deactivated successfully.
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.697 2 DEBUG nova.compute.manager [None req-3e6d54bb-6e49-411b-9ac8-9892f8e58a42 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:47 np0005470441 podman[229622]: 2025-10-04 05:44:47.803428624 +0000 UTC m=+0.120094614 container remove 0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3)
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.812 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ee22de-15d3-4b27-9c9e-2707966ac52b]: (4, ('Sat Oct  4 05:44:47 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 (0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122)\n0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122\nSat Oct  4 05:44:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 (0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122)\n0a6b4b5b3eedbf54d63071bc2272ac36372e3f7088939b89532665861beef122\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.814 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9305d8aa-4477-4a9a-8b69-eefd5d28c972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.816 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090b373d-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 kernel: tap090b373d-80: left promiscuous mode
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.838 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[492f4e87-88d1-49aa-ab1a-b6e88aeb8830]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.888 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3e44d1a8-c529-4a4f-a591-893b48379e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.890 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f2272329-f5a4-40d4-bbf6-ced519ed9732]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.902 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7e6ec9a9-506b-40c2-93e5-5432a42a39a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 456108, 'reachable_time': 25687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229649, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 systemd[1]: run-netns-ovnmeta\x2d090b373d\x2d8a67\x2d47ea\x2dadb2\x2d2b53df5e63e1.mount: Deactivated successfully.
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.905 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:44:47 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:47.905 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[83f70192-36ee-4832-8e6a-972acc7a3b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:47 np0005470441 nova_compute[192626]: 2025-10-04 05:44:47.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:49 np0005470441 podman[229650]: 2025-10-04 05:44:49.359330549 +0000 UTC m=+0.096094603 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.009 2 DEBUG nova.compute.manager [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.009 2 DEBUG oslo_concurrency.lockutils [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.010 2 DEBUG oslo_concurrency.lockutils [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.010 2 DEBUG oslo_concurrency.lockutils [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.010 2 DEBUG nova.compute.manager [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.011 2 WARNING nova.compute.manager [req-fa04774c-ded6-44c1-a97f-d76e51aa618e req-f430de72-45dc-4eff-baa0-d23a59cbd550 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state suspended and task_state None.#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.439 2 INFO nova.compute.manager [None req-61e199c0-9a5c-42db-a416-29dbe68f8e75 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Get console output#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.813 2 INFO nova.compute.manager [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Resuming#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.814 2 DEBUG nova.objects.instance [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'flavor' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.845 2 DEBUG oslo_concurrency.lockutils [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.846 2 DEBUG oslo_concurrency.lockutils [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquired lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:44:50 np0005470441 nova_compute[192626]: 2025-10-04 05:44:50.846 2 DEBUG nova.network.neutron [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:44:52 np0005470441 nova_compute[192626]: 2025-10-04 05:44:52.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:52 np0005470441 nova_compute[192626]: 2025-10-04 05:44:52.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.188 2 DEBUG nova.network.neutron [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.212 2 DEBUG oslo_concurrency.lockutils [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Releasing lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.216 2 DEBUG nova.virt.libvirt.vif [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:44:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1596820660',display_name='tempest-TestNetworkAdvancedServerOps-server-1596820660',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1596820660',id=37,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC4wQ/X6/0SAB6Rc5KlTH9f3pyPVWOiQC4NvXftXMCtQ+LWE8QAb+Sd0w1/aKMwLu1E1D7bdBhUPrAOFVwtwvhneZiVX+1Wmdo4fjjO2BAL4vIvChQgt0wvW5z5SoXjMyQ==',key_name='tempest-TestNetworkAdvancedServerOps-1579178700',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:44:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-9ix08ofh',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:44:47Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=236c13b8-294d-472b-81b3-f3c6635a12ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.217 2 DEBUG nova.network.os_vif_util [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.217 2 DEBUG nova.network.os_vif_util [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.218 2 DEBUG os_vif [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.222 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a355384-49, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a355384-49, col_values=(('external_ids', {'iface-id': '7a355384-4990-4aec-b3c5-fc480e9abef3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:71:6d', 'vm-uuid': '236c13b8-294d-472b-81b3-f3c6635a12ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.223 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.224 2 INFO os_vif [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49')#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.249 2 DEBUG nova.objects.instance [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'numa_topology' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:54 np0005470441 kernel: tap7a355384-49: entered promiscuous mode
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.3225] manager: (tap7a355384-49): new Tun device (/org/freedesktop/NetworkManager/Devices/129)
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:54Z|00285|binding|INFO|Claiming lport 7a355384-4990-4aec-b3c5-fc480e9abef3 for this chassis.
Oct  4 01:44:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:54Z|00286|binding|INFO|7a355384-4990-4aec-b3c5-fc480e9abef3: Claiming fa:16:3e:26:71:6d 10.100.0.5
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.333 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:71:6d 10.100.0.5'], port_security=['fa:16:3e:26:71:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '236c13b8-294d-472b-81b3-f3c6635a12ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '0dc79b17-6ee0-463e-830b-813165dac34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.185'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=472c7ec5-1ea1-42f8-998b-9c7bafb7774e, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7a355384-4990-4aec-b3c5-fc480e9abef3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.334 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7a355384-4990-4aec-b3c5-fc480e9abef3 in datapath 090b373d-8a67-47ea-adb2-2b53df5e63e1 bound to our chassis#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.335 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 090b373d-8a67-47ea-adb2-2b53df5e63e1#033[00m
Oct  4 01:44:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:54Z|00287|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 ovn-installed in OVS
Oct  4 01:44:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:54Z|00288|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 up in Southbound
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.346 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aba8a391-ceb1-4c1d-bc5a-01ba848126a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.346 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap090b373d-81 in ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.349 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap090b373d-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.349 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d8d9e5-ec73-4d3d-ae2c-e687566991d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.349 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cc17f1d7-828f-415d-a936-c7460764b570]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 systemd-udevd[229692]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.361 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[fedeab63-cbc6-4807-8963-bfc18848fb55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.3630] device (tap7a355384-49): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.3644] device (tap7a355384-49): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:44:54 np0005470441 systemd-machined[152624]: New machine qemu-22-instance-00000025.
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.373 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c16c4c2b-630e-438a-bd66-d1d400113b8f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 systemd[1]: Started Virtual Machine qemu-22-instance-00000025.
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.398 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[51cb1707-87bd-43d5-af0c-cc4b0670e9cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 systemd-udevd[229696]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.4044] manager: (tap090b373d-80): new Veth device (/org/freedesktop/NetworkManager/Devices/130)
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.404 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[970f9f19-8103-4f21-9c47-a203347a4fe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.435 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecaae60-e171-46f1-b309-67f2a2d9e990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.438 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7d54aa32-bfec-46e9-bc5e-9acefcc974f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.4647] device (tap090b373d-80): carrier: link connected
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.472 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[46e92e53-3253-419d-82dd-37661f458868]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.490 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b528d2e6-fcda-42fa-8b35-35d64b800ffb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap090b373d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:c9:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459212, 'reachable_time': 35443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229726, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.504 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b57c4472-bc14-49a2-894c-f0aa8e5dd82e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:c969'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459212, 'tstamp': 459212}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229727, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.521 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e139a0fe-146b-457a-9fba-28477a638a94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap090b373d-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:c9:69'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459212, 'reachable_time': 35443, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229728, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.554 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[82e3ebfb-b80a-48b9-9275-e89f327eb074]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.612 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[19568e3e-90a2-4584-91d0-f6d0223c402b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.613 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090b373d-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.614 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.614 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap090b373d-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 NetworkManager[51690]: <info>  [1759556694.6173] manager: (tap090b373d-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Oct  4 01:44:54 np0005470441 kernel: tap090b373d-80: entered promiscuous mode
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.622 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap090b373d-80, col_values=(('external_ids', {'iface-id': '66e99394-913f-4e61-b79f-68494922e9dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 ovn_controller[94840]: 2025-10-04T05:44:54Z|00289|binding|INFO|Releasing lport 66e99394-913f-4e61-b79f-68494922e9dc from this chassis (sb_readonly=0)
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.625 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.628 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e1906b11-e040-4890-a0d5-65b29207abfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.629 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-090b373d-8a67-47ea-adb2-2b53df5e63e1
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/090b373d-8a67-47ea-adb2-2b53df5e63e1.pid.haproxy
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 090b373d-8a67-47ea-adb2-2b53df5e63e1
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:44:54 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:44:54.630 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'env', 'PROCESS_TAG=haproxy-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/090b373d-8a67-47ea-adb2-2b53df5e63e1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:44:54 np0005470441 nova_compute[192626]: 2025-10-04 05:44:54.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:55 np0005470441 podman[229767]: 2025-10-04 05:44:55.009801994 +0000 UTC m=+0.058362880 container create 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:44:55 np0005470441 systemd[1]: Started libpod-conmon-5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37.scope.
Oct  4 01:44:55 np0005470441 podman[229767]: 2025-10-04 05:44:54.979150313 +0000 UTC m=+0.027711259 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:44:55 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:44:55 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a075980b18efb9e9bc775be1a991446403819ec45fe743b33feebd458c029e0c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.172 2 DEBUG nova.compute.manager [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.173 2 DEBUG oslo_concurrency.lockutils [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.173 2 DEBUG oslo_concurrency.lockutils [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.174 2 DEBUG oslo_concurrency.lockutils [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.174 2 DEBUG nova.compute.manager [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.174 2 WARNING nova.compute.manager [req-e28deb4c-68a2-4251-8d87-e8ebc4b3aa0b req-ea452894-e1fa-432f-b3c8-463883e0a013 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state suspended and task_state resuming.#033[00m
Oct  4 01:44:55 np0005470441 podman[229767]: 2025-10-04 05:44:55.181834084 +0000 UTC m=+0.230394950 container init 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:44:55 np0005470441 podman[229767]: 2025-10-04 05:44:55.188425091 +0000 UTC m=+0.236985957 container start 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.207 2 DEBUG nova.virt.libvirt.host [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Removed pending event for 236c13b8-294d-472b-81b3-f3c6635a12ac due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.208 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556695.2068472, 236c13b8-294d-472b-81b3-f3c6635a12ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.208 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Started (Lifecycle Event)#033[00m
Oct  4 01:44:55 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [NOTICE]   (229787) : New worker (229789) forked
Oct  4 01:44:55 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [NOTICE]   (229787) : Loading success.
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.237 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.239 2 DEBUG nova.compute.manager [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.239 2 DEBUG nova.objects.instance [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'pci_devices' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.243 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.272 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.273 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556695.2199173, 236c13b8-294d-472b-81b3-f3c6635a12ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.273 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.277 2 INFO nova.virt.libvirt.driver [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance running successfully.#033[00m
Oct  4 01:44:55 np0005470441 virtqemud[192168]: argument unsupported: QEMU guest agent is not configured
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.280 2 DEBUG nova.virt.libvirt.guest [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.281 2 DEBUG nova.compute.manager [None req-6f2b74f6-39db-4686-a0ee-1e1fb644a0a2 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.296 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.298 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:44:55 np0005470441 nova_compute[192626]: 2025-10-04 05:44:55.330 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.297 2 DEBUG nova.compute.manager [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.297 2 DEBUG oslo_concurrency.lockutils [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.297 2 DEBUG oslo_concurrency.lockutils [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.297 2 DEBUG oslo_concurrency.lockutils [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.298 2 DEBUG nova.compute.manager [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.298 2 WARNING nova.compute.manager [req-1af71832-eff5-47f0-abf6-81bf7c3f9cc3 req-a8bf4c85-02b4-4a65-9b8a-9aacb2ec393f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:44:57 np0005470441 podman[229798]: 2025-10-04 05:44:57.320897114 +0000 UTC m=+0.069573569 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:44:57 np0005470441 podman[229799]: 2025-10-04 05:44:57.335946071 +0000 UTC m=+0.074799397 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.759 2 INFO nova.compute.manager [None req-35b656c8-8594-43bb-b81c-fc0d96a098b0 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Get console output#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.765 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:44:57 np0005470441 nova_compute[192626]: 2025-10-04 05:44:57.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.724 2 DEBUG nova.compute.manager [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.725 2 DEBUG nova.compute.manager [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing instance network info cache due to event network-changed-7a355384-4990-4aec-b3c5-fc480e9abef3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.726 2 DEBUG oslo_concurrency.lockutils [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.726 2 DEBUG oslo_concurrency.lockutils [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.726 2 DEBUG nova.network.neutron [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Refreshing network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.852 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.853 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.853 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.854 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.854 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.855 2 INFO nova.compute.manager [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Terminating instance#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.856 2 DEBUG nova.compute.manager [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:45:00 np0005470441 kernel: tap7a355384-49 (unregistering): left promiscuous mode
Oct  4 01:45:00 np0005470441 NetworkManager[51690]: <info>  [1759556700.8858] device (tap7a355384-49): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:45:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:00Z|00290|binding|INFO|Releasing lport 7a355384-4990-4aec-b3c5-fc480e9abef3 from this chassis (sb_readonly=0)
Oct  4 01:45:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:00Z|00291|binding|INFO|Setting lport 7a355384-4990-4aec-b3c5-fc480e9abef3 down in Southbound
Oct  4 01:45:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:00Z|00292|binding|INFO|Removing iface tap7a355384-49 ovn-installed in OVS
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:00.904 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:71:6d 10.100.0.5'], port_security=['fa:16:3e:26:71:6d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '236c13b8-294d-472b-81b3-f3c6635a12ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0c087ea0f62444e80490916b42c760f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '0dc79b17-6ee0-463e-830b-813165dac34e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=472c7ec5-1ea1-42f8-998b-9c7bafb7774e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=7a355384-4990-4aec-b3c5-fc480e9abef3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:45:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:00.906 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 7a355384-4990-4aec-b3c5-fc480e9abef3 in datapath 090b373d-8a67-47ea-adb2-2b53df5e63e1 unbound from our chassis#033[00m
Oct  4 01:45:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:00.907 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 090b373d-8a67-47ea-adb2-2b53df5e63e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:45:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:00.908 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[92e1186e-d725-4c1d-a7f5-b981f3f5024f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:00.909 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 namespace which is not needed anymore#033[00m
Oct  4 01:45:00 np0005470441 nova_compute[192626]: 2025-10-04 05:45:00.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:00 np0005470441 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct  4 01:45:00 np0005470441 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000025.scope: Consumed 1.032s CPU time.
Oct  4 01:45:00 np0005470441 systemd-machined[152624]: Machine qemu-22-instance-00000025 terminated.
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [NOTICE]   (229787) : haproxy version is 2.8.14-c23fe91
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [NOTICE]   (229787) : path to executable is /usr/sbin/haproxy
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [WARNING]  (229787) : Exiting Master process...
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [WARNING]  (229787) : Exiting Master process...
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [ALERT]    (229787) : Current worker (229789) exited with code 143 (Terminated)
Oct  4 01:45:01 np0005470441 neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1[229783]: [WARNING]  (229787) : All workers exited. Exiting... (0)
Oct  4 01:45:01 np0005470441 systemd[1]: libpod-5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37.scope: Deactivated successfully.
Oct  4 01:45:01 np0005470441 podman[229864]: 2025-10-04 05:45:01.12022134 +0000 UTC m=+0.076049099 container died 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.122 2 INFO nova.virt.libvirt.driver [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Instance destroyed successfully.#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.123 2 DEBUG nova.objects.instance [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lazy-loading 'resources' on Instance uuid 236c13b8-294d-472b-81b3-f3c6635a12ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.142 2 DEBUG nova.virt.libvirt.vif [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:44:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1596820660',display_name='tempest-TestNetworkAdvancedServerOps-server-1596820660',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1596820660',id=37,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC4wQ/X6/0SAB6Rc5KlTH9f3pyPVWOiQC4NvXftXMCtQ+LWE8QAb+Sd0w1/aKMwLu1E1D7bdBhUPrAOFVwtwvhneZiVX+1Wmdo4fjjO2BAL4vIvChQgt0wvW5z5SoXjMyQ==',key_name='tempest-TestNetworkAdvancedServerOps-1579178700',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:44:26Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0c087ea0f62444e80490916b42c760f',ramdisk_id='',reservation_id='r-9ix08ofh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1635331179',owner_user_name='tempest-TestNetworkAdvancedServerOps-1635331179-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:44:55Z,user_data=None,user_id='d65c768451494a3f9e4f9a238fa5c40d',uuid=236c13b8-294d-472b-81b3-f3c6635a12ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.142 2 DEBUG nova.network.os_vif_util [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converting VIF {"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.143 2 DEBUG nova.network.os_vif_util [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.143 2 DEBUG os_vif [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.145 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.145 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a355384-49, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37-userdata-shm.mount: Deactivated successfully.
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:45:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay-a075980b18efb9e9bc775be1a991446403819ec45fe743b33feebd458c029e0c-merged.mount: Deactivated successfully.
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.151 2 INFO os_vif [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:71:6d,bridge_name='br-int',has_traffic_filtering=True,id=7a355384-4990-4aec-b3c5-fc480e9abef3,network=Network(090b373d-8a67-47ea-adb2-2b53df5e63e1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a355384-49')#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.152 2 INFO nova.virt.libvirt.driver [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Deleting instance files /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac_del#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.152 2 INFO nova.virt.libvirt.driver [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Deletion of /var/lib/nova/instances/236c13b8-294d-472b-81b3-f3c6635a12ac_del complete#033[00m
Oct  4 01:45:01 np0005470441 podman[229864]: 2025-10-04 05:45:01.160882449 +0000 UTC m=+0.116710198 container cleanup 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:45:01 np0005470441 systemd[1]: libpod-conmon-5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37.scope: Deactivated successfully.
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.222 2 INFO nova.compute.manager [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.223 2 DEBUG oslo.service.loopingcall [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.224 2 DEBUG nova.compute.manager [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.224 2 DEBUG nova.network.neutron [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:45:01 np0005470441 podman[229909]: 2025-10-04 05:45:01.236501693 +0000 UTC m=+0.054577301 container remove 5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.242 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[82668e76-153e-4940-a8f1-aaa05975149c]: (4, ('Sat Oct  4 05:45:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 (5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37)\n5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37\nSat Oct  4 05:45:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 (5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37)\n5b7cb47d7475515720e7937caf0ce55637103ed55e443f3aff615e129e5b9a37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.244 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1d1182-1cc8-4036-b9db-3b1a90f21a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.244 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap090b373d-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:01 np0005470441 kernel: tap090b373d-80: left promiscuous mode
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.251 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[52b25535-7321-451f-b0c2-b792d9cc7afa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 nova_compute[192626]: 2025-10-04 05:45:01.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.274 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce552f0-3afc-4ad8-874f-628148c6411d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.276 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d943494d-31f4-4e7c-a8ca-34b0a26008e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.290 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3a0b318d-1720-4a53-9244-b1ee38449838]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459204, 'reachable_time': 36328, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229925, 'error': None, 'target': 'ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.293 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-090b373d-8a67-47ea-adb2-2b53df5e63e1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:45:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:01.293 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[c1760a53-78ec-4c1c-a7f6-7b0af0a1e3fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:01 np0005470441 systemd[1]: run-netns-ovnmeta\x2d090b373d\x2d8a67\x2d47ea\x2dadb2\x2d2b53df5e63e1.mount: Deactivated successfully.
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.307 2 DEBUG nova.network.neutron [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:45:02 np0005470441 podman[229926]: 2025-10-04 05:45:02.323245872 +0000 UTC m=+0.070209930 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:45:02 np0005470441 podman[229927]: 2025-10-04 05:45:02.323404327 +0000 UTC m=+0.067605846 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.331 2 INFO nova.compute.manager [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Took 1.11 seconds to deallocate network for instance.#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.384 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.384 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.394 2 DEBUG nova.compute.manager [req-4dab1bd3-24b6-4c6d-8ecc-0c129ca12424 req-dab3b03e-6637-4419-b15d-ee9651f460b3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-deleted-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.604 2 DEBUG nova.compute.provider_tree [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.630 2 DEBUG nova.scheduler.client.report [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.651 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.720 2 INFO nova.scheduler.client.report [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Deleted allocations for instance 236c13b8-294d-472b-81b3-f3c6635a12ac#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.817 2 DEBUG oslo_concurrency.lockutils [None req-abdac861-b282-4ae5-adbb-bc7062a0bfa9 d65c768451494a3f9e4f9a238fa5c40d d0c087ea0f62444e80490916b42c760f - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.828 2 DEBUG nova.compute.manager [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.828 2 DEBUG oslo_concurrency.lockutils [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.829 2 DEBUG oslo_concurrency.lockutils [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.829 2 DEBUG oslo_concurrency.lockutils [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "236c13b8-294d-472b-81b3-f3c6635a12ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.829 2 DEBUG nova.compute.manager [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] No waiting events found dispatching network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.829 2 WARNING nova.compute.manager [req-1905ebcb-4f8a-4d4c-949a-cb39ccc377ec req-6964e732-cbf6-4833-ab69-2922f937361a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Received unexpected event network-vif-plugged-7a355384-4990-4aec-b3c5-fc480e9abef3 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:45:02 np0005470441 nova_compute[192626]: 2025-10-04 05:45:02.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:03 np0005470441 nova_compute[192626]: 2025-10-04 05:45:03.252 2 DEBUG nova.network.neutron [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updated VIF entry in instance network info cache for port 7a355384-4990-4aec-b3c5-fc480e9abef3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:45:03 np0005470441 nova_compute[192626]: 2025-10-04 05:45:03.253 2 DEBUG nova.network.neutron [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Updating instance_info_cache with network_info: [{"id": "7a355384-4990-4aec-b3c5-fc480e9abef3", "address": "fa:16:3e:26:71:6d", "network": {"id": "090b373d-8a67-47ea-adb2-2b53df5e63e1", "bridge": "br-int", "label": "tempest-network-smoke--504196849", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0c087ea0f62444e80490916b42c760f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a355384-49", "ovs_interfaceid": "7a355384-4990-4aec-b3c5-fc480e9abef3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:45:03 np0005470441 nova_compute[192626]: 2025-10-04 05:45:03.280 2 DEBUG oslo_concurrency.lockutils [req-5b1a31b9-3ff9-4191-bf9a-43d2e2c3ee42 req-c9fadb5b-13d7-4951-b62e-f8ea30a805fc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-236c13b8-294d-472b-81b3-f3c6635a12ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:45:04 np0005470441 nova_compute[192626]: 2025-10-04 05:45:04.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:04 np0005470441 nova_compute[192626]: 2025-10-04 05:45:04.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:05 np0005470441 nova_compute[192626]: 2025-10-04 05:45:05.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:05 np0005470441 nova_compute[192626]: 2025-10-04 05:45:05.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:45:06 np0005470441 nova_compute[192626]: 2025-10-04 05:45:06.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:06 np0005470441 nova_compute[192626]: 2025-10-04 05:45:06.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:06 np0005470441 nova_compute[192626]: 2025-10-04 05:45:06.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:06.752 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:06.753 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:06.753 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:07 np0005470441 nova_compute[192626]: 2025-10-04 05:45:07.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:09 np0005470441 podman[229965]: 2025-10-04 05:45:09.334576835 +0000 UTC m=+0.086384086 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_id=edpm, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., distribution-scope=public)
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.732 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.752 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.752 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.752 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.909 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.910 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5690MB free_disk=73.42475128173828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.910 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.911 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.972 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.972 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:45:09 np0005470441 nova_compute[192626]: 2025-10-04 05:45:09.990 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.005 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.029 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.030 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.030 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.030 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 01:45:10 np0005470441 nova_compute[192626]: 2025-10-04 05:45:10.045 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 01:45:11 np0005470441 nova_compute[192626]: 2025-10-04 05:45:11.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:11 np0005470441 nova_compute[192626]: 2025-10-04 05:45:11.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:11 np0005470441 nova_compute[192626]: 2025-10-04 05:45:11.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 01:45:12 np0005470441 nova_compute[192626]: 2025-10-04 05:45:12.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:12 np0005470441 nova_compute[192626]: 2025-10-04 05:45:12.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:14 np0005470441 nova_compute[192626]: 2025-10-04 05:45:14.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:14 np0005470441 nova_compute[192626]: 2025-10-04 05:45:14.740 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:15 np0005470441 podman[229988]: 2025-10-04 05:45:15.346318027 +0000 UTC m=+0.081095994 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.120 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556701.1193464, 236c13b8-294d-472b-81b3-f3c6635a12ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.120 2 INFO nova.compute.manager [-] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.140 2 DEBUG nova.compute.manager [None req-a2692071-3260-44dc-88dc-03d9ea0f0cad - - - - - -] [instance: 236c13b8-294d-472b-81b3-f3c6635a12ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:16 np0005470441 nova_compute[192626]: 2025-10-04 05:45:16.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:17 np0005470441 podman[230012]: 2025-10-04 05:45:17.327562684 +0000 UTC m=+0.069156030 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent)
Oct  4 01:45:17 np0005470441 nova_compute[192626]: 2025-10-04 05:45:17.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:20 np0005470441 podman[230031]: 2025-10-04 05:45:20.348301033 +0000 UTC m=+0.101826000 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  4 01:45:21 np0005470441 nova_compute[192626]: 2025-10-04 05:45:21.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:22 np0005470441 nova_compute[192626]: 2025-10-04 05:45:22.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:26 np0005470441 nova_compute[192626]: 2025-10-04 05:45:26.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:26.329 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:2a:5a 10.100.0.2 2001:db8::f816:3eff:fea2:2a5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea2:2a5a/64', 'neutron:device_id': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fc4abf8-8cf7-4116-a194-254fe81a88ac, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3011c29a-41c7-42ec-b849-284b7adb1cc8) old=Port_Binding(mac=['fa:16:3e:a2:2a:5a 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:45:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:26.330 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3011c29a-41c7-42ec-b849-284b7adb1cc8 in datapath 7276fadd-7b41-4e61-aace-db1bca5ce8f0 updated#033[00m
Oct  4 01:45:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:26.331 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7276fadd-7b41-4e61-aace-db1bca5ce8f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:45:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:26.332 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1ade998d-b500-4ba2-938c-972820fa673f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:27 np0005470441 nova_compute[192626]: 2025-10-04 05:45:27.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:28 np0005470441 podman[230057]: 2025-10-04 05:45:28.316229433 +0000 UTC m=+0.068306145 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:45:28 np0005470441 podman[230058]: 2025-10-04 05:45:28.323057179 +0000 UTC m=+0.061990784 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:45:31 np0005470441 nova_compute[192626]: 2025-10-04 05:45:31.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:31 np0005470441 nova_compute[192626]: 2025-10-04 05:45:31.475 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:45:32 np0005470441 nova_compute[192626]: 2025-10-04 05:45:32.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:33 np0005470441 podman[230104]: 2025-10-04 05:45:33.322209404 +0000 UTC m=+0.078041826 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  4 01:45:33 np0005470441 podman[230105]: 2025-10-04 05:45:33.322940175 +0000 UTC m=+0.074253867 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.518 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.519 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.549 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.693 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.694 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.702 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.703 2 INFO nova.compute.claims [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.868 2 DEBUG nova.compute.provider_tree [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.886 2 DEBUG nova.scheduler.client.report [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.910 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.911 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.962 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.964 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:45:34 np0005470441 nova_compute[192626]: 2025-10-04 05:45:34.995 2 INFO nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.016 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:45:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:35.135 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:45:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:35.137 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.186 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.187 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.187 2 INFO nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Creating image(s)#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.188 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.188 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.189 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.205 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.286 2 DEBUG nova.policy [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.291 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.292 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.293 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.309 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.363 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.363 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.393 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.394 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.394 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.482 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.482 2 DEBUG nova.virt.disk.api [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.483 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.575 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.576 2 DEBUG nova.virt.disk.api [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.577 2 DEBUG nova.objects.instance [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 776da40d-5353-42f4-98f8-13b045395ff0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.591 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.592 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Ensure instance console log exists: /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.593 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.594 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:35 np0005470441 nova_compute[192626]: 2025-10-04 05:45:35.594 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:36 np0005470441 nova_compute[192626]: 2025-10-04 05:45:36.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:36 np0005470441 nova_compute[192626]: 2025-10-04 05:45:36.453 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Successfully created port: 4066fc4d-fc40-4b04-b760-5c297bb4e954 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:45:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:37.139 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.023 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Successfully updated port: 4066fc4d-fc40-4b04-b760-5c297bb4e954 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.037 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.037 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.037 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:38 np0005470441 nova_compute[192626]: 2025-10-04 05:45:38.146 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.310 2 DEBUG nova.network.neutron [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.413 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.413 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Instance network_info: |[{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.416 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Start _get_guest_xml network_info=[{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.424 2 DEBUG nova.compute.manager [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.425 2 DEBUG nova.compute.manager [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing instance network info cache due to event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.425 2 DEBUG oslo_concurrency.lockutils [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.425 2 DEBUG oslo_concurrency.lockutils [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.426 2 DEBUG nova.network.neutron [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.428 2 WARNING nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.433 2 DEBUG nova.virt.libvirt.host [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.434 2 DEBUG nova.virt.libvirt.host [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.438 2 DEBUG nova.virt.libvirt.host [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.438 2 DEBUG nova.virt.libvirt.host [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.439 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.440 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.440 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.440 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.441 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.441 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.441 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.441 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.442 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.442 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.442 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.442 2 DEBUG nova.virt.hardware [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.446 2 DEBUG nova.virt.libvirt.vif [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:45:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1851043624',display_name='tempest-TestGettingAddress-server-1851043624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1851043624',id=40,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+fumAMK4eGampQeF4hlruIcaaRBYmbhgeNl6SetwJUPy46+iiikMXv7AerAURtoVhgrIfhgBJ4psNsdbxnYSv6jDkmxEGB9TsMqxJSFd3pHcMBAZrUSncmZpDks1nzzg==',key_name='tempest-TestGettingAddress-788686438',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-jpm0oom4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:45:35Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=776da40d-5353-42f4-98f8-13b045395ff0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.446 2 DEBUG nova.network.os_vif_util [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.447 2 DEBUG nova.network.os_vif_util [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.447 2 DEBUG nova.objects.instance [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 776da40d-5353-42f4-98f8-13b045395ff0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.478 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <uuid>776da40d-5353-42f4-98f8-13b045395ff0</uuid>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <name>instance-00000028</name>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-1851043624</nova:name>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:45:39</nova:creationTime>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        <nova:port uuid="4066fc4d-fc40-4b04-b760-5c297bb4e954">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe73:872c" ipVersion="6"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="serial">776da40d-5353-42f4-98f8-13b045395ff0</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="uuid">776da40d-5353-42f4-98f8-13b045395ff0</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.config"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:73:87:2c"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <target dev="tap4066fc4d-fc"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/console.log" append="off"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:45:39 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:45:39 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:45:39 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:45:39 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.479 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Preparing to wait for external event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.480 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.480 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.481 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.481 2 DEBUG nova.virt.libvirt.vif [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:45:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1851043624',display_name='tempest-TestGettingAddress-server-1851043624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1851043624',id=40,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+fumAMK4eGampQeF4hlruIcaaRBYmbhgeNl6SetwJUPy46+iiikMXv7AerAURtoVhgrIfhgBJ4psNsdbxnYSv6jDkmxEGB9TsMqxJSFd3pHcMBAZrUSncmZpDks1nzzg==',key_name='tempest-TestGettingAddress-788686438',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-jpm0oom4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:45:35Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=776da40d-5353-42f4-98f8-13b045395ff0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.482 2 DEBUG nova.network.os_vif_util [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.483 2 DEBUG nova.network.os_vif_util [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.483 2 DEBUG os_vif [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4066fc4d-fc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.488 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4066fc4d-fc, col_values=(('external_ids', {'iface-id': '4066fc4d-fc40-4b04-b760-5c297bb4e954', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:87:2c', 'vm-uuid': '776da40d-5353-42f4-98f8-13b045395ff0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:39 np0005470441 NetworkManager[51690]: <info>  [1759556739.4913] manager: (tap4066fc4d-fc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.503 2 INFO os_vif [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc')#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.626 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.626 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.626 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:73:87:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:45:39 np0005470441 nova_compute[192626]: 2025-10-04 05:45:39.627 2 INFO nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Using config drive#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.156 2 INFO nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Creating config drive at /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.config#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.160 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9smots6s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.281 2 DEBUG oslo_concurrency.processutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9smots6s" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:45:40 np0005470441 podman[230161]: 2025-10-04 05:45:40.295667717 +0000 UTC m=+0.054455667 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct  4 01:45:40 np0005470441 kernel: tap4066fc4d-fc: entered promiscuous mode
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.3443] manager: (tap4066fc4d-fc): new Tun device (/org/freedesktop/NetworkManager/Devices/133)
Oct  4 01:45:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:40Z|00293|binding|INFO|Claiming lport 4066fc4d-fc40-4b04-b760-5c297bb4e954 for this chassis.
Oct  4 01:45:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:40Z|00294|binding|INFO|4066fc4d-fc40-4b04-b760-5c297bb4e954: Claiming fa:16:3e:73:87:2c 10.100.0.9 2001:db8::f816:3eff:fe73:872c
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 systemd-udevd[230195]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.378 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:87:2c 10.100.0.9 2001:db8::f816:3eff:fe73:872c'], port_security=['fa:16:3e:73:87:2c 10.100.0.9 2001:db8::f816:3eff:fe73:872c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe73:872c/64', 'neutron:device_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '35a5117b-c0ca-43df-9c9c-3bd64cd71b3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fc4abf8-8cf7-4116-a194-254fe81a88ac, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=4066fc4d-fc40-4b04-b760-5c297bb4e954) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.379 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 4066fc4d-fc40-4b04-b760-5c297bb4e954 in datapath 7276fadd-7b41-4e61-aace-db1bca5ce8f0 bound to our chassis#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.380 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7276fadd-7b41-4e61-aace-db1bca5ce8f0#033[00m
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.3909] device (tap4066fc4d-fc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.3937] device (tap4066fc4d-fc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.393 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[41775588-abcf-47a4-8da3-395b2b061c43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.394 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7276fadd-71 in ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:45:40 np0005470441 systemd-machined[152624]: New machine qemu-23-instance-00000028.
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.397 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7276fadd-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.397 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c664db15-e7da-42c0-acb1-2c86cb2ab906]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.398 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[77aea90d-15ec-4e49-87ee-a5896d196966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.409 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[64cebffc-f6e5-4262-b4ec-c0b72920bdec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:40Z|00295|binding|INFO|Setting lport 4066fc4d-fc40-4b04-b760-5c297bb4e954 ovn-installed in OVS
Oct  4 01:45:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:40Z|00296|binding|INFO|Setting lport 4066fc4d-fc40-4b04-b760-5c297bb4e954 up in Southbound
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 systemd[1]: Started Virtual Machine qemu-23-instance-00000028.
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.431 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e54136d3-43b4-4974-b4df-ef642cf98d0b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.456 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[00742b24-b039-4996-a522-5ec2ed70a782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.462 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b410db-1fab-4b38-b8d3-8914daee28fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 systemd-udevd[230200]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.4628] manager: (tap7276fadd-70): new Veth device (/org/freedesktop/NetworkManager/Devices/134)
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.493 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[5adac891-c22b-4c7e-9e61-8dc14d0551d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.499 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[37ee872a-7f66-43fa-93cf-d24f8d7e5c79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.5210] device (tap7276fadd-70): carrier: link connected
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.528 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8906a0-77cf-4bf5-8825-5e34a4c3a2f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.542 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[82634604-ee2a-402c-957f-e92c08e09fb0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7276fadd-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:2a:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463817, 'reachable_time': 15264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230230, 'error': None, 'target': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.554 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6215adec-d99b-4d56-862d-4952faec05a1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea2:2a5a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463817, 'tstamp': 463817}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230231, 'error': None, 'target': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.566 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6c32a92e-bc42-48e4-9cb3-97039fc223a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7276fadd-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a2:2a:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463817, 'reachable_time': 15264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230232, 'error': None, 'target': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.588 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2c25cf90-2798-4e73-a8bb-50b85b378223]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.630 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8628017e-2395-4f04-a731-31e24105a608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.631 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7276fadd-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.631 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.631 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7276fadd-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 NetworkManager[51690]: <info>  [1759556740.6334] manager: (tap7276fadd-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Oct  4 01:45:40 np0005470441 kernel: tap7276fadd-70: entered promiscuous mode
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.640 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7276fadd-70, col_values=(('external_ids', {'iface-id': '3011c29a-41c7-42ec-b849-284b7adb1cc8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:40Z|00297|binding|INFO|Releasing lport 3011c29a-41c7-42ec-b849-284b7adb1cc8 from this chassis (sb_readonly=0)
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.656 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7276fadd-7b41-4e61-aace-db1bca5ce8f0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7276fadd-7b41-4e61-aace-db1bca5ce8f0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.656 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5a6385-32c3-4945-93d0-b8d4df755fe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.657 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-7276fadd-7b41-4e61-aace-db1bca5ce8f0
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/7276fadd-7b41-4e61-aace-db1bca5ce8f0.pid.haproxy
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 7276fadd-7b41-4e61-aace-db1bca5ce8f0
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:45:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:45:40.657 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'env', 'PROCESS_TAG=haproxy-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7276fadd-7b41-4e61-aace-db1bca5ce8f0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.761 2 DEBUG nova.compute.manager [req-c5744271-461f-4b0a-9640-07b11a09b708 req-266f09d5-4437-4275-85c5-b73b3f5cde5c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.761 2 DEBUG oslo_concurrency.lockutils [req-c5744271-461f-4b0a-9640-07b11a09b708 req-266f09d5-4437-4275-85c5-b73b3f5cde5c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.762 2 DEBUG oslo_concurrency.lockutils [req-c5744271-461f-4b0a-9640-07b11a09b708 req-266f09d5-4437-4275-85c5-b73b3f5cde5c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.762 2 DEBUG oslo_concurrency.lockutils [req-c5744271-461f-4b0a-9640-07b11a09b708 req-266f09d5-4437-4275-85c5-b73b3f5cde5c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:40 np0005470441 nova_compute[192626]: 2025-10-04 05:45:40.762 2 DEBUG nova.compute.manager [req-c5744271-461f-4b0a-9640-07b11a09b708 req-266f09d5-4437-4275-85c5-b73b3f5cde5c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Processing event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:45:41 np0005470441 podman[230270]: 2025-10-04 05:45:41.050636852 +0000 UTC m=+0.053464219 container create b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:45:41 np0005470441 systemd[1]: Started libpod-conmon-b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32.scope.
Oct  4 01:45:41 np0005470441 podman[230270]: 2025-10-04 05:45:41.020375971 +0000 UTC m=+0.023203358 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:45:41 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:45:41 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7c98ee95f3e061b611bf473f78c44c02a4be074df08f06578dd8a28e4f907180/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:45:41 np0005470441 podman[230270]: 2025-10-04 05:45:41.134966958 +0000 UTC m=+0.137794325 container init b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:45:41 np0005470441 podman[230270]: 2025-10-04 05:45:41.14236212 +0000 UTC m=+0.145189497 container start b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:45:41 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [NOTICE]   (230289) : New worker (230291) forked
Oct  4 01:45:41 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [NOTICE]   (230289) : Loading success.
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.246 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.247 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556741.2459657, 776da40d-5353-42f4-98f8-13b045395ff0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.248 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] VM Started (Lifecycle Event)#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.251 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.255 2 INFO nova.virt.libvirt.driver [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Instance spawned successfully.#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.256 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.274 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.279 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.282 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.283 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.283 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.284 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.284 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.284 2 DEBUG nova.virt.libvirt.driver [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.316 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.317 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556741.2471728, 776da40d-5353-42f4-98f8-13b045395ff0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.317 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.352 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.356 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556741.2504432, 776da40d-5353-42f4-98f8-13b045395ff0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.356 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.361 2 INFO nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Took 6.17 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.361 2 DEBUG nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.376 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.381 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.405 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.429 2 INFO nova.compute.manager [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Took 6.77 seconds to build instance.#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.452 2 DEBUG oslo_concurrency.lockutils [None req-dc101d8c-cef8-4db6-b4eb-6ba00f739ec0 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.933s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.580 2 DEBUG nova.network.neutron [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updated VIF entry in instance network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.580 2 DEBUG nova.network.neutron [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:45:41 np0005470441 nova_compute[192626]: 2025-10-04 05:45:41.610 2 DEBUG oslo_concurrency.lockutils [req-00445c8c-5648-4ab3-82c9-ee36c3555c0c req-4b400b86-b757-4c87-8d5e-423746c0717e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.840 2 DEBUG nova.compute.manager [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.841 2 DEBUG oslo_concurrency.lockutils [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.842 2 DEBUG oslo_concurrency.lockutils [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.842 2 DEBUG oslo_concurrency.lockutils [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.843 2 DEBUG nova.compute.manager [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] No waiting events found dispatching network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:45:42 np0005470441 nova_compute[192626]: 2025-10-04 05:45:42.843 2 WARNING nova.compute.manager [req-e0f88e94-69d9-4f0b-9b67-4249d84c5407 req-83caee37-315f-499b-b1bf-9ae74773cd54 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received unexpected event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:45:43 np0005470441 nova_compute[192626]: 2025-10-04 05:45:43.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:44 np0005470441 NetworkManager[51690]: <info>  [1759556744.3198] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Oct  4 01:45:44 np0005470441 NetworkManager[51690]: <info>  [1759556744.3217] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:44 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:44Z|00298|binding|INFO|Releasing lport 3011c29a-41c7-42ec-b849-284b7adb1cc8 from this chassis (sb_readonly=0)
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.931 2 DEBUG nova.compute.manager [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.932 2 DEBUG nova.compute.manager [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing instance network info cache due to event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.933 2 DEBUG oslo_concurrency.lockutils [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.933 2 DEBUG oslo_concurrency.lockutils [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:45:44 np0005470441 nova_compute[192626]: 2025-10-04 05:45:44.934 2 DEBUG nova.network.neutron [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:45:46 np0005470441 nova_compute[192626]: 2025-10-04 05:45:46.313 2 DEBUG nova.network.neutron [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updated VIF entry in instance network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:45:46 np0005470441 nova_compute[192626]: 2025-10-04 05:45:46.314 2 DEBUG nova.network.neutron [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:45:46 np0005470441 podman[230301]: 2025-10-04 05:45:46.315712366 +0000 UTC m=+0.062925181 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:45:46 np0005470441 nova_compute[192626]: 2025-10-04 05:45:46.335 2 DEBUG oslo_concurrency.lockutils [req-e283bc92-c1aa-4713-9e1f-03d4c9f4310b req-7de385f7-4204-4c0e-a4f7-4927e829abf8 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:45:48 np0005470441 nova_compute[192626]: 2025-10-04 05:45:48.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:48 np0005470441 podman[230326]: 2025-10-04 05:45:48.322495959 +0000 UTC m=+0.071321023 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  4 01:45:49 np0005470441 nova_compute[192626]: 2025-10-04 05:45:49.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:51 np0005470441 podman[230347]: 2025-10-04 05:45:51.354369588 +0000 UTC m=+0.097222366 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:45:53 np0005470441 nova_compute[192626]: 2025-10-04 05:45:53.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:53 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:53Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:73:87:2c 10.100.0.9
Oct  4 01:45:53 np0005470441 ovn_controller[94840]: 2025-10-04T05:45:53Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:73:87:2c 10.100.0.9
Oct  4 01:45:54 np0005470441 nova_compute[192626]: 2025-10-04 05:45:54.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:58 np0005470441 nova_compute[192626]: 2025-10-04 05:45:58.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:45:59 np0005470441 podman[230385]: 2025-10-04 05:45:59.304403735 +0000 UTC m=+0.054288672 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:45:59 np0005470441 podman[230384]: 2025-10-04 05:45:59.305225519 +0000 UTC m=+0.060933134 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Oct  4 01:45:59 np0005470441 nova_compute[192626]: 2025-10-04 05:45:59.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.714 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '776da40d-5353-42f4-98f8-13b045395ff0', 'name': 'tempest-TestGettingAddress-server-1851043624', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000028', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3993802d0c4a44febb9b33931e51db84', 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'hostId': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.715 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.716 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.716 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>]
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.717 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.720 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 776da40d-5353-42f4-98f8-13b045395ff0 / tap4066fc4d-fc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.721 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e16de5e-c86e-4859-9cce-2a09eddc6b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.717453', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '698bf7ae-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '4d6f4a0948f765fdbc40ac20ded97358785579426e8fdcc350966ee3396a1788'}]}, 'timestamp': '2025-10-04 05:46:02.722429', '_unique_id': '67f3e21d487845fba5604fea5e633d0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.724 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.725 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.754 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.latency volume: 2046958879 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.755 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b75123e3-da30-4e0c-87ca-145291f805a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2046958879, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.725855', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6991080c-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': '3812552a6b4c3abebb2b48de2d889ca77dbe14c5a18b959e09539c8a72eaf1f1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.725855', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699124f4-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': '3f7b93c4858568c2ec448764b1df3c13cc89a71a55a5f8c07f53c298e43fa79e'}]}, 'timestamp': '2025-10-04 05:46:02.756283', '_unique_id': '4f774500b15c4b6c91ab74cf7e85c9f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.757 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.759 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.759 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88e03f90-6303-4cd6-835f-7c38acee5b13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.759494', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '6991bd10-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '519d4f0b8a396c3a21b702109ebdb9b93fb6b0e95c0d6273c6a0d882d6fe9532'}]}, 'timestamp': '2025-10-04 05:46:02.760282', '_unique_id': 'd1084f25675541cbbd60e7aa54a3cc5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.761 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.762 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.779 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.780 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f2297a8-4724-4f82-a580-2186a44066b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.763027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6994e0f8-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': '4f82e0b6af17b5c8d2c61455cc858f554e201d5a30d418046e03c70e76501290'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.763027', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69950092-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': 'f929e6e0e49fc9f8f75285400dc4fe89506e9ee60ca5ebf862a44835e9b490f4'}]}, 'timestamp': '2025-10-04 05:46:02.781691', '_unique_id': 'bdfa15d4d08f43f58f4661d9123e8b74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.783 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.785 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '091ffc86-4825-48ff-a984-f963b5c8ac3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.785455', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '6995af60-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '3f336eb39fa31484f63447ff2236404d4991808b51be2770552f294d4c07df03'}]}, 'timestamp': '2025-10-04 05:46:02.786055', '_unique_id': 'bb76b983484041f585d9495a89f3662d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.787 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.788 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.789 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56b99cda-03f8-48d4-a419-4c80ea1bdf74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.789033', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '69963b1a-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': 'a65ca16803176130803835370f1dc52bb6fda9826db473da8346dfae91c516eb'}]}, 'timestamp': '2025-10-04 05:46:02.789634', '_unique_id': 'c43dec8f04f24eb6becc1126856db347'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.790 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.791 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.792 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.bytes volume: 31386112 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.792 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '519a7294-7b8c-48aa-b079-4f3c7f3b5401', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31386112, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.792060', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6996afd2-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'c06348562e9ac175074edede6e2ea1b889f1f0d60a41c0404393d72762649a0e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.792060', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6996c38c-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': '831e43f07480aeefc9e47027a9a0584cdf145295aa1bfbd28cef3da8bbcbb1eb'}]}, 'timestamp': '2025-10-04 05:46:02.793059', '_unique_id': '5b52686359bf4f56a3ce34faa4aebabe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.794 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.795 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.795 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.796 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>]
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.796 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.796 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.requests volume: 318 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.797 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '252a07b7-56ec-485b-95a0-a8bc66d5536a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 318, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.796581', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69975ff4-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'a18de4d3726615def2ffe10c9c648dcdb5c4996b1d986e4ec84ef312eb609b6f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.796581', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699773b8-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'f97f5824c495e0cc85c0592369b27267025625f1535da351e9c9dc008a39171e'}]}, 'timestamp': '2025-10-04 05:46:02.797602', '_unique_id': 'ebc42a97328b4a7a8cc8e250e9f682e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.798 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.799 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.800 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.incoming.bytes volume: 4613 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '941407d3-4c1a-4240-bd67-1bd1314e8066', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4613, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.800116', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '6997ea78-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': 'ce2555bdd0d6a5a9b481a3008063a11e5e2c9040c9e89713396d8e5241c96e87'}]}, 'timestamp': '2025-10-04 05:46:02.800682', '_unique_id': '9857faa3082d47f7b8615811327e3623'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.801 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.802 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.802 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>]
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.802 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.803 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.latency volume: 673593615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.803 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.latency volume: 45709874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0cf57ce8-04a2-4694-870a-4e942e83d3d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 673593615, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.803075', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69985b70-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': '2bd47dc613c039f58b7008b0526e13e6f8db51f1a1fa6ab3fb1c24b32e158c37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 45709874, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.803075', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69986bf6-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'eed12cc522b5c7ba8a20575434ca923a291fb9a774cc3e6ad2bbfb0d9820c346'}]}, 'timestamp': '2025-10-04 05:46:02.803898', '_unique_id': '88614f081bb2418da222ab4b7c039292'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.804 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.805 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.805 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.806 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17bd6120-11f4-4c10-85c4-35513b1f642d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.805876', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '6998c6fa-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': '81eb74b5eb4140a0b55b1b93c0c96eee64decad5cca362730c9990d703a54a2c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.805876', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '6998d17c-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': '03618a4f63fe60b4de38442cc9d8c57d87556c02f2b12655e3d662887b4cfc43'}]}, 'timestamp': '2025-10-04 05:46:02.806460', '_unique_id': '62587a86d38e4de4aa27c3c980236862'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.807 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.808 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.808 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.requests volume: 1147 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.808 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56e38e34-77e0-4ec6-8394-0c7175cc57c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1147, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.808243', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699923e8-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'b15e3df0957960b1e47d6a5821d1d2bbc80af15cce3c14cb696f0c905772fe7a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.808243', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '69992fe6-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'e138cf15470361f62ae2d6e9f70f5cd7aabd3750dd9e0e6bc1f13fe4e8acd2e8'}]}, 'timestamp': '2025-10-04 05:46:02.808856', '_unique_id': 'daa1f2af134644718ea0c8869b41329a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.809 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.810 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.811 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '483b83f3-86ce-410d-ba57-651b2a14a089', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.810769', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '69998748-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': 'e5205a8edcfc7c1eeb64ff47f43a2cacd5814376562f51b3d6760171c6068d2d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.810769', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699994ea-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.477620082, 'message_signature': 'e10086546d604652da84f41cf5ed0d77e767761c28ccb6039659997077623563'}]}, 'timestamp': '2025-10-04 05:46:02.811480', '_unique_id': '4c52efd02bf5438b875e7539c7285305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.812 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.813 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.834 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/cpu volume: 11290000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be219314-a7ff-4f17-be37-1a164e79d7d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11290000000, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'timestamp': '2025-10-04T05:46:02.813380', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '699d35be-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.549202041, 'message_signature': 'e98b6a24e84228a338539269f8484beadf65e740e5390c902fbd7927a27fee1c'}]}, 'timestamp': '2025-10-04 05:46:02.835337', '_unique_id': 'dd4cb7f1e75046048b612882e65d0e32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.836 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.837 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.837 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.837 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58dfe3e5-d6aa-4c45-8d0a-b07454b52e96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-vda', 'timestamp': '2025-10-04T05:46:02.837415', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '699d99c8-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': 'f9530ed8699cc404f59608ac016c5d5d1dc9df6e642d18920e255427803750ae'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0-sda', 'timestamp': '2025-10-04T05:46:02.837415', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '699da7c4-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.440451793, 'message_signature': '0c70063fc99eb5c6c39a1ce8e0e00ec72b838b0cc96fb26b330008e5ca1f7003'}]}, 'timestamp': '2025-10-04 05:46:02.838143', '_unique_id': '7e5cd07fa9fc47fcb911d31300d2a5da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.838 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.839 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.839 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f07feacb-eb53-42a1-9452-9ffd30f67d53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.839940', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '699dfa62-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '11e566731e06f773b0cbdb5a035447419421b995885c60af9c477f87cfe6938f'}]}, 'timestamp': '2025-10-04 05:46:02.840276', '_unique_id': '99f8351847d04c5a8ae1c25445cc1388'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.840 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.841 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.841 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e65125d-11e8-4edf-9f95-b2795508a322', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.841824', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '699e430a-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': 'c0055f72f1b0df56e4970ecaf5ed26cc25e55991c251ceefe29c7f0b1ddab9c3'}]}, 'timestamp': '2025-10-04 05:46:02.842157', '_unique_id': '61605435ed7446c19365be9026f5fb91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.842 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.843 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.843 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e377854f-7ae2-4467-8f39-b2990185372f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.843895', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '699e9404-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': 'e0d3a6fae59008e015febf14045f22e09cd50d9321cf9d52c3dff9c89fab2160'}]}, 'timestamp': '2025-10-04 05:46:02.844324', '_unique_id': 'b5d651fd80334adf9ac10a25ccc02ab2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.845 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.846 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.846 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1851043624>]
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.846 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.847 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.outgoing.bytes volume: 3704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '495d6250-6bd4-4998-baa3-20bdcf37c293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3704, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.846851', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '699f0f6a-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '1f871b07e0660205ef4fbc8fd3ef7cabb284d2af15780ba7ab699a64dd19aa51'}]}, 'timestamp': '2025-10-04 05:46:02.847368', '_unique_id': 'a453732904dc428ab58bec6004c3b354'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.848 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.849 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/network.incoming.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd652a4d7-a46c-4dcc-8bcf-96a867ba0949', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': 'instance-00000028-776da40d-5353-42f4-98f8-13b045395ff0-tap4066fc4d-fc', 'timestamp': '2025-10-04T05:46:02.849077', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'tap4066fc4d-fc', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:73:87:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4066fc4d-fc'}, 'message_id': '699f5fba-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.432084952, 'message_signature': '1f01251be0abff8cc65a3dc8fc2cf563d9166fcaddd6cfd75b12b096e0203dcf'}]}, 'timestamp': '2025-10-04 05:46:02.849423', '_unique_id': 'c0a560b6d4864e05a8a32fe946c5c95e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.850 12 DEBUG ceilometer.compute.pollsters [-] 776da40d-5353-42f4-98f8-13b045395ff0/memory.usage volume: 42.84765625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7094d36-a606-447e-93ba-2ab0a7a57990', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.84765625, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_name': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_name': None, 'resource_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'timestamp': '2025-10-04T05:46:02.850889', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1851043624', 'name': 'instance-00000028', 'instance_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'instance_type': 'm1.nano', 'host': '4ae0ce5df6eece56e483088e847cacb22a727b93cf401fb7ce576c21', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': '9585bc8c-c7a8-4928-b67c-bb6035012f8e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '2b7414ad-3419-4b92-8471-b72003f69821'}, 'image_ref': '2b7414ad-3419-4b92-8471-b72003f69821', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '699fa506-a0e5-11f0-8814-fa163ed2379c', 'monotonic_time': 4660.549202041, 'message_signature': 'e220a03bb58e0d845a28e9bd75cf1a2d20780e744dd97e2c1dedb56b06ea660d'}]}, 'timestamp': '2025-10-04 05:46:02.851206', '_unique_id': '4dc72592668c4e23b353440e376d7c9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     yield
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  4 01:46:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:46:02.851 12 ERROR oslo_messaging.notify.messaging 
Oct  4 01:46:03 np0005470441 nova_compute[192626]: 2025-10-04 05:46:03.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:04 np0005470441 podman[230430]: 2025-10-04 05:46:04.324478793 +0000 UTC m=+0.080312581 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:46:04 np0005470441 podman[230431]: 2025-10-04 05:46:04.341006589 +0000 UTC m=+0.097342181 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 01:46:04 np0005470441 nova_compute[192626]: 2025-10-04 05:46:04.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:05 np0005470441 nova_compute[192626]: 2025-10-04 05:46:05.744 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:06 np0005470441 nova_compute[192626]: 2025-10-04 05:46:06.710 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:06.753 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:06.754 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:06.755 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:07 np0005470441 nova_compute[192626]: 2025-10-04 05:46:07.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:07 np0005470441 nova_compute[192626]: 2025-10-04 05:46:07.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:46:08 np0005470441 nova_compute[192626]: 2025-10-04 05:46:08.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:09 np0005470441 nova_compute[192626]: 2025-10-04 05:46:09.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.786 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.787 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.788 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.788 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.916 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:10 np0005470441 podman[230469]: 2025-10-04 05:46:10.934465654 +0000 UTC m=+0.090177145 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.989 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:10 np0005470441 nova_compute[192626]: 2025-10-04 05:46:10.990 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.065 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.206 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.207 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5543MB free_disk=73.39177322387695GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.208 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.208 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.436 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 776da40d-5353-42f4-98f8-13b045395ff0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.437 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.437 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.484 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.548 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.776 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:46:11 np0005470441 nova_compute[192626]: 2025-10-04 05:46:11.776 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:12 np0005470441 nova_compute[192626]: 2025-10-04 05:46:12.778 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:12 np0005470441 nova_compute[192626]: 2025-10-04 05:46:12.778 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:46:12 np0005470441 nova_compute[192626]: 2025-10-04 05:46:12.778 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:46:13 np0005470441 nova_compute[192626]: 2025-10-04 05:46:13.073 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:46:13 np0005470441 nova_compute[192626]: 2025-10-04 05:46:13.073 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:46:13 np0005470441 nova_compute[192626]: 2025-10-04 05:46:13.074 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:46:13 np0005470441 nova_compute[192626]: 2025-10-04 05:46:13.074 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 776da40d-5353-42f4-98f8-13b045395ff0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:46:13 np0005470441 nova_compute[192626]: 2025-10-04 05:46:13.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:14 np0005470441 nova_compute[192626]: 2025-10-04 05:46:14.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:17 np0005470441 podman[230496]: 2025-10-04 05:46:17.337404396 +0000 UTC m=+0.081690150 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:46:18 np0005470441 nova_compute[192626]: 2025-10-04 05:46:18.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:19 np0005470441 podman[230520]: 2025-10-04 05:46:19.330132526 +0000 UTC m=+0.074360190 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:46:19 np0005470441 nova_compute[192626]: 2025-10-04 05:46:19.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:19 np0005470441 nova_compute[192626]: 2025-10-04 05:46:19.959 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.012 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.013 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.013 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.013 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.013 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:20 np0005470441 nova_compute[192626]: 2025-10-04 05:46:20.013 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:46:22 np0005470441 podman[230539]: 2025-10-04 05:46:22.365448813 +0000 UTC m=+0.106081232 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_controller)
Oct  4 01:46:23 np0005470441 nova_compute[192626]: 2025-10-04 05:46:23.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:24 np0005470441 nova_compute[192626]: 2025-10-04 05:46:24.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:28 np0005470441 nova_compute[192626]: 2025-10-04 05:46:28.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:29 np0005470441 nova_compute[192626]: 2025-10-04 05:46:29.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:30 np0005470441 podman[230566]: 2025-10-04 05:46:30.297541473 +0000 UTC m=+0.050598726 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Oct  4 01:46:30 np0005470441 podman[230567]: 2025-10-04 05:46:30.302430954 +0000 UTC m=+0.046713015 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:46:33 np0005470441 nova_compute[192626]: 2025-10-04 05:46:33.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:34 np0005470441 nova_compute[192626]: 2025-10-04 05:46:34.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:46:34Z|00299|binding|INFO|Releasing lport 3011c29a-41c7-42ec-b849-284b7adb1cc8 from this chassis (sb_readonly=0)
Oct  4 01:46:35 np0005470441 nova_compute[192626]: 2025-10-04 05:46:35.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:35 np0005470441 podman[230611]: 2025-10-04 05:46:35.311837713 +0000 UTC m=+0.062121428 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:46:35 np0005470441 podman[230612]: 2025-10-04 05:46:35.319246586 +0000 UTC m=+0.065372721 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible)
Oct  4 01:46:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:35.326 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:46:35 np0005470441 nova_compute[192626]: 2025-10-04 05:46:35.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:35.327 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:46:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:35.328 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:46:38 np0005470441 nova_compute[192626]: 2025-10-04 05:46:38.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:39 np0005470441 nova_compute[192626]: 2025-10-04 05:46:39.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:39 np0005470441 nova_compute[192626]: 2025-10-04 05:46:39.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:41 np0005470441 podman[230649]: 2025-10-04 05:46:41.352946121 +0000 UTC m=+0.092879433 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct  4 01:46:43 np0005470441 nova_compute[192626]: 2025-10-04 05:46:43.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:44 np0005470441 nova_compute[192626]: 2025-10-04 05:46:44.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.073 2 DEBUG nova.compute.manager [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.073 2 DEBUG nova.compute.manager [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing instance network info cache due to event network-changed-4066fc4d-fc40-4b04-b760-5c297bb4e954. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.074 2 DEBUG oslo_concurrency.lockutils [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.074 2 DEBUG oslo_concurrency.lockutils [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.074 2 DEBUG nova.network.neutron [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Refreshing network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.161 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.161 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.162 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.162 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.162 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.163 2 INFO nova.compute.manager [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Terminating instance#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.164 2 DEBUG nova.compute.manager [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:46:45 np0005470441 kernel: tap4066fc4d-fc (unregistering): left promiscuous mode
Oct  4 01:46:45 np0005470441 NetworkManager[51690]: <info>  [1759556805.1933] device (tap4066fc4d-fc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:46:45Z|00300|binding|INFO|Releasing lport 4066fc4d-fc40-4b04-b760-5c297bb4e954 from this chassis (sb_readonly=0)
Oct  4 01:46:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:46:45Z|00301|binding|INFO|Setting lport 4066fc4d-fc40-4b04-b760-5c297bb4e954 down in Southbound
Oct  4 01:46:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:46:45Z|00302|binding|INFO|Removing iface tap4066fc4d-fc ovn-installed in OVS
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.219 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:87:2c 10.100.0.9 2001:db8::f816:3eff:fe73:872c'], port_security=['fa:16:3e:73:87:2c 10.100.0.9 2001:db8::f816:3eff:fe73:872c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8::f816:3eff:fe73:872c/64', 'neutron:device_id': '776da40d-5353-42f4-98f8-13b045395ff0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '35a5117b-c0ca-43df-9c9c-3bd64cd71b3a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fc4abf8-8cf7-4116-a194-254fe81a88ac, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=4066fc4d-fc40-4b04-b760-5c297bb4e954) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.221 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 4066fc4d-fc40-4b04-b760-5c297bb4e954 in datapath 7276fadd-7b41-4e61-aace-db1bca5ce8f0 unbound from our chassis#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.222 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7276fadd-7b41-4e61-aace-db1bca5ce8f0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.223 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[006ffc24-f623-47fd-867a-0a8fcb446c1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.224 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0 namespace which is not needed anymore#033[00m
Oct  4 01:46:45 np0005470441 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000028.scope: Deactivated successfully.
Oct  4 01:46:45 np0005470441 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000028.scope: Consumed 15.077s CPU time.
Oct  4 01:46:45 np0005470441 systemd-machined[152624]: Machine qemu-23-instance-00000028 terminated.
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [NOTICE]   (230289) : haproxy version is 2.8.14-c23fe91
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [NOTICE]   (230289) : path to executable is /usr/sbin/haproxy
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [WARNING]  (230289) : Exiting Master process...
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [WARNING]  (230289) : Exiting Master process...
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [ALERT]    (230289) : Current worker (230291) exited with code 143 (Terminated)
Oct  4 01:46:45 np0005470441 neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0[230285]: [WARNING]  (230289) : All workers exited. Exiting... (0)
Oct  4 01:46:45 np0005470441 systemd[1]: libpod-b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32.scope: Deactivated successfully.
Oct  4 01:46:45 np0005470441 podman[230695]: 2025-10-04 05:46:45.350852426 +0000 UTC m=+0.044478870 container died b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  4 01:46:45 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32-userdata-shm.mount: Deactivated successfully.
Oct  4 01:46:45 np0005470441 systemd[1]: var-lib-containers-storage-overlay-7c98ee95f3e061b611bf473f78c44c02a4be074df08f06578dd8a28e4f907180-merged.mount: Deactivated successfully.
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 podman[230695]: 2025-10-04 05:46:45.396261822 +0000 UTC m=+0.089888306 container cleanup b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 systemd[1]: libpod-conmon-b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32.scope: Deactivated successfully.
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.433 2 INFO nova.virt.libvirt.driver [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Instance destroyed successfully.#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.433 2 DEBUG nova.objects.instance [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 776da40d-5353-42f4-98f8-13b045395ff0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.460 2 DEBUG nova.virt.libvirt.vif [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:45:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1851043624',display_name='tempest-TestGettingAddress-server-1851043624',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1851043624',id=40,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG+fumAMK4eGampQeF4hlruIcaaRBYmbhgeNl6SetwJUPy46+iiikMXv7AerAURtoVhgrIfhgBJ4psNsdbxnYSv6jDkmxEGB9TsMqxJSFd3pHcMBAZrUSncmZpDks1nzzg==',key_name='tempest-TestGettingAddress-788686438',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:45:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-jpm0oom4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:45:41Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=776da40d-5353-42f4-98f8-13b045395ff0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.460 2 DEBUG nova.network.os_vif_util [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.461 2 DEBUG nova.network.os_vif_util [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.462 2 DEBUG os_vif [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4066fc4d-fc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.469 2 DEBUG nova.compute.manager [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-unplugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.469 2 DEBUG oslo_concurrency.lockutils [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.469 2 DEBUG oslo_concurrency.lockutils [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.470 2 DEBUG oslo_concurrency.lockutils [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.470 2 DEBUG nova.compute.manager [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] No waiting events found dispatching network-vif-unplugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.470 2 DEBUG nova.compute.manager [req-58d46ce0-2d6c-46fd-9696-b26c685d4f74 req-bee9be82-4083-4a06-8fe7-5ef282acce0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-unplugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 podman[230734]: 2025-10-04 05:46:45.472445084 +0000 UTC m=+0.048191908 container remove b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.473 2 INFO os_vif [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:73:87:2c,bridge_name='br-int',has_traffic_filtering=True,id=4066fc4d-fc40-4b04-b760-5c297bb4e954,network=Network(7276fadd-7b41-4e61-aace-db1bca5ce8f0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4066fc4d-fc')#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.474 2 INFO nova.virt.libvirt.driver [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Deleting instance files /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0_del#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.474 2 INFO nova.virt.libvirt.driver [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Deletion of /var/lib/nova/instances/776da40d-5353-42f4-98f8-13b045395ff0_del complete#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.477 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9a1f8e-e23a-4887-8147-e3c90b82d0d2]: (4, ('Sat Oct  4 05:46:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0 (b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32)\nb245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32\nSat Oct  4 05:46:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0 (b245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32)\nb245d01acc98c9b6230494ab172f02e2ae2418f070436017469fdfadeffe6f32\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.479 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3e8a2387-e262-4238-b438-184df730eeed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.480 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7276fadd-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:46:45 np0005470441 kernel: tap7276fadd-70: left promiscuous mode
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.497 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[035b471d-4c19-4cee-bc34-a62043a055fb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.519 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8bb84641-bf49-44c4-bfd6-b35fc1607496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.521 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aece5350-214f-401f-b303-e9251b89ca16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.526 2 INFO nova.compute.manager [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.526 2 DEBUG oslo.service.loopingcall [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.526 2 DEBUG nova.compute.manager [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:46:45 np0005470441 nova_compute[192626]: 2025-10-04 05:46:45.527 2 DEBUG nova.network.neutron [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.535 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[42927405-7941-42ad-863c-1f3ce970280a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463810, 'reachable_time': 17866, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230755, 'error': None, 'target': 'ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:45 np0005470441 systemd[1]: run-netns-ovnmeta\x2d7276fadd\x2d7b41\x2d4e61\x2daace\x2ddb1bca5ce8f0.mount: Deactivated successfully.
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.539 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7276fadd-7b41-4e61-aace-db1bca5ce8f0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:46:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:46:45.539 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[addd6a9b-8d83-4534-8ed2-1704fc30528f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.569 2 DEBUG nova.compute.manager [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.570 2 DEBUG oslo_concurrency.lockutils [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "776da40d-5353-42f4-98f8-13b045395ff0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.570 2 DEBUG oslo_concurrency.lockutils [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.570 2 DEBUG oslo_concurrency.lockutils [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.570 2 DEBUG nova.compute.manager [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] No waiting events found dispatching network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.570 2 WARNING nova.compute.manager [req-f517fb27-3d9d-45ff-9f91-3c517c25008e req-b2eb7a4d-12c6-4694-992a-65e5fb679c2c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received unexpected event network-vif-plugged-4066fc4d-fc40-4b04-b760-5c297bb4e954 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.886 2 DEBUG nova.network.neutron [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.912 2 INFO nova.compute.manager [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Took 2.39 seconds to deallocate network for instance.#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.959 2 DEBUG nova.compute.manager [req-567178ec-77b0-4e20-9cae-ca47d3063f02 req-1a6e2577-f039-46db-a1e3-6ee23452a483 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Received event network-vif-deleted-4066fc4d-fc40-4b04-b760-5c297bb4e954 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.969 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:47 np0005470441 nova_compute[192626]: 2025-10-04 05:46:47.970 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.029 2 DEBUG nova.compute.provider_tree [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.051 2 DEBUG nova.scheduler.client.report [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.093 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.120 2 INFO nova.scheduler.client.report [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 776da40d-5353-42f4-98f8-13b045395ff0#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.200 2 DEBUG oslo_concurrency.lockutils [None req-4df736dc-8363-4c74-b9c6-8b6d8ffee749 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "776da40d-5353-42f4-98f8-13b045395ff0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:48 np0005470441 podman[230756]: 2025-10-04 05:46:48.314995767 +0000 UTC m=+0.062092477 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.637 2 DEBUG nova.network.neutron [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updated VIF entry in instance network info cache for port 4066fc4d-fc40-4b04-b760-5c297bb4e954. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.637 2 DEBUG nova.network.neutron [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Updating instance_info_cache with network_info: [{"id": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "address": "fa:16:3e:73:87:2c", "network": {"id": "7276fadd-7b41-4e61-aace-db1bca5ce8f0", "bridge": "br-int", "label": "tempest-network-smoke--976342248", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe73:872c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4066fc4d-fc", "ovs_interfaceid": "4066fc4d-fc40-4b04-b760-5c297bb4e954", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:46:48 np0005470441 nova_compute[192626]: 2025-10-04 05:46:48.655 2 DEBUG oslo_concurrency.lockutils [req-6e54e724-d45e-42b1-b8a3-531ab8fe85d8 req-38b5cb99-cef7-4888-9905-083cba969afb 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-776da40d-5353-42f4-98f8-13b045395ff0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:46:50 np0005470441 podman[230783]: 2025-10-04 05:46:50.289785819 +0000 UTC m=+0.048472305 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:46:50 np0005470441 nova_compute[192626]: 2025-10-04 05:46:50.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:52 np0005470441 nova_compute[192626]: 2025-10-04 05:46:52.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:52 np0005470441 nova_compute[192626]: 2025-10-04 05:46:52.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:53 np0005470441 nova_compute[192626]: 2025-10-04 05:46:53.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:53 np0005470441 podman[230804]: 2025-10-04 05:46:53.329427471 +0000 UTC m=+0.076024778 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:46:55 np0005470441 nova_compute[192626]: 2025-10-04 05:46:55.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.459 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.459 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.482 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.573 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.574 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.581 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.581 2 INFO nova.compute.claims [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.707 2 DEBUG nova.compute.provider_tree [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.721 2 DEBUG nova.scheduler.client.report [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.746 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.747 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.803 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.804 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.826 2 INFO nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.846 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.933 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.935 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.936 2 INFO nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Creating image(s)#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.936 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.936 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.937 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.954 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:56 np0005470441 nova_compute[192626]: 2025-10-04 05:46:56.987 2 DEBUG nova.policy [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b2989168a314457b9d68405a2e5b9ab8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ec39d6d697445438e79b0bfc666a027', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.007 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.008 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.009 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.031 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.094 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.096 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.138 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.139 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.139 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.198 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.199 2 DEBUG nova.virt.disk.api [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Checking if we can resize image /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.200 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.291 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.294 2 DEBUG nova.virt.disk.api [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Cannot resize image /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.294 2 DEBUG nova.objects.instance [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'migration_context' on Instance uuid 4233ce36-e7ec-41b5-aff4-59969b5a18ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.329 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.330 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Ensure instance console log exists: /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.330 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.331 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:46:57 np0005470441 nova_compute[192626]: 2025-10-04 05:46:57.331 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:46:58 np0005470441 nova_compute[192626]: 2025-10-04 05:46:58.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:46:59 np0005470441 nova_compute[192626]: 2025-10-04 05:46:59.248 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Successfully created port: a31216fb-2878-4ad4-99fe-326d8971cb62 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.433 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556805.4314349, 776da40d-5353-42f4-98f8-13b045395ff0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.434 2 INFO nova.compute.manager [-] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.458 2 DEBUG nova.compute.manager [None req-51933add-d141-4297-8f03-2410a4ab306d - - - - - -] [instance: 776da40d-5353-42f4-98f8-13b045395ff0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.858 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Successfully updated port: a31216fb-2878-4ad4-99fe-326d8971cb62 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.880 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.880 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquired lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.881 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.972 2 DEBUG nova.compute.manager [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.973 2 DEBUG nova.compute.manager [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing instance network info cache due to event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:47:00 np0005470441 nova_compute[192626]: 2025-10-04 05:47:00.973 2 DEBUG oslo_concurrency.lockutils [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:47:01 np0005470441 nova_compute[192626]: 2025-10-04 05:47:01.055 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:47:01 np0005470441 podman[230845]: 2025-10-04 05:47:01.316882441 +0000 UTC m=+0.068787189 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  4 01:47:01 np0005470441 podman[230846]: 2025-10-04 05:47:01.366342854 +0000 UTC m=+0.099385260 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.067 2 DEBUG nova.network.neutron [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.088 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Releasing lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.089 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Instance network_info: |[{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.090 2 DEBUG oslo_concurrency.lockutils [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.090 2 DEBUG nova.network.neutron [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.095 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Start _get_guest_xml network_info=[{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.102 2 WARNING nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.108 2 DEBUG nova.virt.libvirt.host [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.109 2 DEBUG nova.virt.libvirt.host [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.119 2 DEBUG nova.virt.libvirt.host [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.120 2 DEBUG nova.virt.libvirt.host [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.122 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.122 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.123 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.123 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.124 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.124 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.125 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.125 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.126 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.126 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.127 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.127 2 DEBUG nova.virt.hardware [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.133 2 DEBUG nova.virt.libvirt.vif [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1165607435',display_name='tempest-TestNetworkBasicOps-server-1165607435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1165607435',id=43,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWlsvBeGeS9/pEzPjLNbP2Af5gbK0pEDhDhzs4f6aFsrWJkBtoP1d0SH2ByuqXjD3NsUKuBAyq04b7A8+th0w4wg75EKN7D9++H7bmQadj14BJQEzSTu0VyyhLTAOcCEQ==',key_name='tempest-TestNetworkBasicOps-1404055165',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-7dtrxtto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:46:56Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=4233ce36-e7ec-41b5-aff4-59969b5a18ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.134 2 DEBUG nova.network.os_vif_util [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.135 2 DEBUG nova.network.os_vif_util [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.137 2 DEBUG nova.objects.instance [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4233ce36-e7ec-41b5-aff4-59969b5a18ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.155 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <uuid>4233ce36-e7ec-41b5-aff4-59969b5a18ed</uuid>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <name>instance-0000002b</name>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestNetworkBasicOps-server-1165607435</nova:name>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:47:02</nova:creationTime>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:user uuid="b2989168a314457b9d68405a2e5b9ab8">tempest-TestNetworkBasicOps-600174410-project-member</nova:user>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:project uuid="7ec39d6d697445438e79b0bfc666a027">tempest-TestNetworkBasicOps-600174410</nova:project>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        <nova:port uuid="a31216fb-2878-4ad4-99fe-326d8971cb62">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="serial">4233ce36-e7ec-41b5-aff4-59969b5a18ed</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="uuid">4233ce36-e7ec-41b5-aff4-59969b5a18ed</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.config"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:a2:b1:00"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <target dev="tapa31216fb-28"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/console.log" append="off"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:47:02 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:47:02 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:47:02 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:47:02 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.158 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Preparing to wait for external event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.158 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.159 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.159 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.160 2 DEBUG nova.virt.libvirt.vif [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1165607435',display_name='tempest-TestNetworkBasicOps-server-1165607435',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1165607435',id=43,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWlsvBeGeS9/pEzPjLNbP2Af5gbK0pEDhDhzs4f6aFsrWJkBtoP1d0SH2ByuqXjD3NsUKuBAyq04b7A8+th0w4wg75EKN7D9++H7bmQadj14BJQEzSTu0VyyhLTAOcCEQ==',key_name='tempest-TestNetworkBasicOps-1404055165',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-7dtrxtto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:46:56Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=4233ce36-e7ec-41b5-aff4-59969b5a18ed,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.161 2 DEBUG nova.network.os_vif_util [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.162 2 DEBUG nova.network.os_vif_util [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.162 2 DEBUG os_vif [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.164 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.164 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa31216fb-28, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.169 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa31216fb-28, col_values=(('external_ids', {'iface-id': 'a31216fb-2878-4ad4-99fe-326d8971cb62', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:b1:00', 'vm-uuid': '4233ce36-e7ec-41b5-aff4-59969b5a18ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:02 np0005470441 NetworkManager[51690]: <info>  [1759556822.1723] manager: (tapa31216fb-28): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.181 2 INFO os_vif [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28')#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.247 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.247 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.248 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] No VIF found with MAC fa:16:3e:a2:b1:00, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:47:02 np0005470441 nova_compute[192626]: 2025-10-04 05:47:02.248 2 INFO nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Using config drive#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.169 2 INFO nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Creating config drive at /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.config#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.177 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo24blpc4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.304 2 DEBUG oslo_concurrency.processutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo24blpc4" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:47:03 np0005470441 kernel: tapa31216fb-28: entered promiscuous mode
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.3801] manager: (tapa31216fb-28): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Oct  4 01:47:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:03Z|00303|binding|INFO|Claiming lport a31216fb-2878-4ad4-99fe-326d8971cb62 for this chassis.
Oct  4 01:47:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:03Z|00304|binding|INFO|a31216fb-2878-4ad4-99fe-326d8971cb62: Claiming fa:16:3e:a2:b1:00 10.100.0.11
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.397 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:b1:00 10.100.0.11'], port_security=['fa:16:3e:a2:b1:00 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4233ce36-e7ec-41b5-aff4-59969b5a18ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f0c855-540f-4888-917f-7ede262ace90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8af7f447-9bdb-47f2-bf14-ef768c8dd981', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0206e0b6-0a94-4177-8673-c705f12dc5d7, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a31216fb-2878-4ad4-99fe-326d8971cb62) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.399 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a31216fb-2878-4ad4-99fe-326d8971cb62 in datapath d9f0c855-540f-4888-917f-7ede262ace90 bound to our chassis#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.401 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d9f0c855-540f-4888-917f-7ede262ace90#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.418 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[193d9b95-0d3d-427d-b0cb-e87196f8f17d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.419 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd9f0c855-51 in ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.421 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd9f0c855-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.421 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[942dc677-4a59-459a-92db-7caa7be9c370]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.423 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c3307647-30db-4e09-98fb-47e51bc232c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 systemd-machined[152624]: New machine qemu-24-instance-0000002b.
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.438 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[fca2e47d-8981-4957-8a54-1646d207bef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.464 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5ab4dab9-1df1-4e1c-869b-0f4bcf2c3357]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 systemd[1]: Started Virtual Machine qemu-24-instance-0000002b.
Oct  4 01:47:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:03Z|00305|binding|INFO|Setting lport a31216fb-2878-4ad4-99fe-326d8971cb62 ovn-installed in OVS
Oct  4 01:47:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:03Z|00306|binding|INFO|Setting lport a31216fb-2878-4ad4-99fe-326d8971cb62 up in Southbound
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 systemd-udevd[230913]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.4947] device (tapa31216fb-28): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.4953] device (tapa31216fb-28): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.510 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8a0f5e32-ad20-4aca-9a36-62eb588774dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.515 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a2493804-947d-4a8e-bbf7-13d21e54e296]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 systemd-udevd[230916]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.5166] manager: (tapd9f0c855-50): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.558 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[eeba8d19-a858-4c12-83f3-56ae4a68b7ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.562 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[bc93b9ea-1d8c-4373-9350-9824ee7b554d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.5943] device (tapd9f0c855-50): carrier: link connected
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.603 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[d87f6b91-dae3-4f49-a0a7-4e09ca1e47ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.626 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e39d55fc-3251-4e5a-ad1c-1972a0c76e05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f0c855-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:a6:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472125, 'reachable_time': 44712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230942, 'error': None, 'target': 'ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.647 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[105ba4f9-36e6-4350-98c8-04e389018c78]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:a657'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 472125, 'tstamp': 472125}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230943, 'error': None, 'target': 'ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.668 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1c210360-1025-4396-9ca1-0b31a6718df4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd9f0c855-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:a6:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472125, 'reachable_time': 44712, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230944, 'error': None, 'target': 'ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.702 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[17c4d448-1269-4342-aaad-b2c3fbe1485f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.778 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2e7b9c75-41b8-42d8-9363-0b6e1c869dae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.780 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f0c855-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.780 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.781 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd9f0c855-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 kernel: tapd9f0c855-50: entered promiscuous mode
Oct  4 01:47:03 np0005470441 NetworkManager[51690]: <info>  [1759556823.7847] manager: (tapd9f0c855-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.792 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd9f0c855-50, col_values=(('external_ids', {'iface-id': '29b885b4-8a51-4780-a849-525802b3b5cd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:03Z|00307|binding|INFO|Releasing lport 29b885b4-8a51-4780-a849-525802b3b5cd from this chassis (sb_readonly=0)
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.797 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d9f0c855-540f-4888-917f-7ede262ace90.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d9f0c855-540f-4888-917f-7ede262ace90.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.798 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7aef6be0-55ec-447b-9418-97c33127011c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.799 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-d9f0c855-540f-4888-917f-7ede262ace90
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/d9f0c855-540f-4888-917f-7ede262ace90.pid.haproxy
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID d9f0c855-540f-4888-917f-7ede262ace90
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:47:03 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:03.800 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90', 'env', 'PROCESS_TAG=haproxy-d9f0c855-540f-4888-917f-7ede262ace90', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d9f0c855-540f-4888-917f-7ede262ace90.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:47:03 np0005470441 nova_compute[192626]: 2025-10-04 05:47:03.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.207 2 DEBUG nova.compute.manager [req-96220f94-ea40-4bbe-bd91-8cc1573f2c01 req-3263e514-3109-45d3-b4c1-b7cda776b745 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.208 2 DEBUG oslo_concurrency.lockutils [req-96220f94-ea40-4bbe-bd91-8cc1573f2c01 req-3263e514-3109-45d3-b4c1-b7cda776b745 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.209 2 DEBUG oslo_concurrency.lockutils [req-96220f94-ea40-4bbe-bd91-8cc1573f2c01 req-3263e514-3109-45d3-b4c1-b7cda776b745 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.209 2 DEBUG oslo_concurrency.lockutils [req-96220f94-ea40-4bbe-bd91-8cc1573f2c01 req-3263e514-3109-45d3-b4c1-b7cda776b745 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.209 2 DEBUG nova.compute.manager [req-96220f94-ea40-4bbe-bd91-8cc1573f2c01 req-3263e514-3109-45d3-b4c1-b7cda776b745 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Processing event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:47:04 np0005470441 podman[230976]: 2025-10-04 05:47:04.250241656 +0000 UTC m=+0.089950549 container create 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:47:04 np0005470441 podman[230976]: 2025-10-04 05:47:04.200388052 +0000 UTC m=+0.040097005 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:47:04 np0005470441 systemd[1]: Started libpod-conmon-37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257.scope.
Oct  4 01:47:04 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:47:04 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c4a677f4caad4733150b8bd26d266d57deca54a444e46abfa1755a75d3d1213/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.351 2 DEBUG nova.network.neutron [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updated VIF entry in instance network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.352 2 DEBUG nova.network.neutron [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:04 np0005470441 podman[230976]: 2025-10-04 05:47:04.362208157 +0000 UTC m=+0.201917100 container init 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:47:04 np0005470441 podman[230976]: 2025-10-04 05:47:04.372446571 +0000 UTC m=+0.212155454 container start 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  4 01:47:04 np0005470441 nova_compute[192626]: 2025-10-04 05:47:04.374 2 DEBUG oslo_concurrency.lockutils [req-2284681f-32f0-4863-bf3b-a6455ba05547 req-a123b1d7-04c7-465b-9dfb-065c31aaca76 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:47:04 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [NOTICE]   (230995) : New worker (230999) forked
Oct  4 01:47:04 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [NOTICE]   (230995) : Loading success.
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.005 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556825.0052664, 4233ce36-e7ec-41b5-aff4-59969b5a18ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.006 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] VM Started (Lifecycle Event)#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.008 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.013 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.018 2 INFO nova.virt.libvirt.driver [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Instance spawned successfully.#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.019 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.031 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.042 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.050 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.051 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.052 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.052 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.053 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.053 2 DEBUG nova.virt.libvirt.driver [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.088 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.089 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556825.006348, 4233ce36-e7ec-41b5-aff4-59969b5a18ed => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.090 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.120 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.125 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556825.0116415, 4233ce36-e7ec-41b5-aff4-59969b5a18ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.126 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.132 2 INFO nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Took 8.20 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.132 2 DEBUG nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.148 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.153 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.176 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.211 2 INFO nova.compute.manager [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Took 8.66 seconds to build instance.#033[00m
Oct  4 01:47:05 np0005470441 nova_compute[192626]: 2025-10-04 05:47:05.229 2 DEBUG oslo_concurrency.lockutils [None req-324b66a3-af31-4272-9ffc-b75bbe4e6d47 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.294 2 DEBUG nova.compute.manager [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.294 2 DEBUG oslo_concurrency.lockutils [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.294 2 DEBUG oslo_concurrency.lockutils [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.294 2 DEBUG oslo_concurrency.lockutils [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.295 2 DEBUG nova.compute.manager [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] No waiting events found dispatching network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.295 2 WARNING nova.compute.manager [req-57c477b1-1087-4da4-b768-c11cf252eaaf req-bcfef6b5-4e70-44dd-8c5d-ebdb16b215b2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received unexpected event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:47:06 np0005470441 podman[231013]: 2025-10-04 05:47:06.303309731 +0000 UTC m=+0.055919279 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible)
Oct  4 01:47:06 np0005470441 podman[231014]: 2025-10-04 05:47:06.318477387 +0000 UTC m=+0.068316066 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  4 01:47:06 np0005470441 nova_compute[192626]: 2025-10-04 05:47:06.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:06.755 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:06.756 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:06.757 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:07 np0005470441 nova_compute[192626]: 2025-10-04 05:47:07.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:07 np0005470441 nova_compute[192626]: 2025-10-04 05:47:07.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:07 np0005470441 nova_compute[192626]: 2025-10-04 05:47:07.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:07 np0005470441 nova_compute[192626]: 2025-10-04 05:47:07.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:47:08 np0005470441 nova_compute[192626]: 2025-10-04 05:47:08.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:09 np0005470441 NetworkManager[51690]: <info>  [1759556829.5676] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Oct  4 01:47:09 np0005470441 NetworkManager[51690]: <info>  [1759556829.5687] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:09 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:09Z|00308|binding|INFO|Releasing lport 29b885b4-8a51-4780-a849-525802b3b5cd from this chassis (sb_readonly=0)
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.981 2 DEBUG nova.compute.manager [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.981 2 DEBUG nova.compute.manager [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing instance network info cache due to event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.981 2 DEBUG oslo_concurrency.lockutils [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.982 2 DEBUG oslo_concurrency.lockutils [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:47:09 np0005470441 nova_compute[192626]: 2025-10-04 05:47:09.982 2 DEBUG nova.network.neutron [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:47:10 np0005470441 nova_compute[192626]: 2025-10-04 05:47:10.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.366 2 DEBUG nova.network.neutron [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updated VIF entry in instance network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.367 2 DEBUG nova.network.neutron [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.398 2 DEBUG oslo_concurrency.lockutils [req-ce068c09-14dd-4d20-a904-8f4a7d21a1ed req-b727fe75-ddd7-4fe6-8f70-aae63d46640d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.904 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.904 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.904 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:47:11 np0005470441 nova_compute[192626]: 2025-10-04 05:47:11.905 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4233ce36-e7ec-41b5-aff4-59969b5a18ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:47:12 np0005470441 nova_compute[192626]: 2025-10-04 05:47:12.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:12 np0005470441 podman[231053]: 2025-10-04 05:47:12.321894209 +0000 UTC m=+0.077469030 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  4 01:47:13 np0005470441 nova_compute[192626]: 2025-10-04 05:47:13.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.174 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.191 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.192 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.193 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.193 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.216 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.217 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.217 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.218 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.308 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.395 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.397 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.466 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.639 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.640 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5539MB free_disk=73.41965866088867GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.640 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.640 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.720 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 4233ce36-e7ec-41b5-aff4-59969b5a18ed actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.720 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.721 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.739 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.757 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.757 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.773 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.796 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.839 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.857 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.879 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:47:14 np0005470441 nova_compute[192626]: 2025-10-04 05:47:14.880 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:15 np0005470441 nova_compute[192626]: 2025-10-04 05:47:15.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:15 np0005470441 nova_compute[192626]: 2025-10-04 05:47:15.404 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:16Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:b1:00 10.100.0.11
Oct  4 01:47:16 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:16Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:b1:00 10.100.0.11
Oct  4 01:47:16 np0005470441 nova_compute[192626]: 2025-10-04 05:47:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:17 np0005470441 nova_compute[192626]: 2025-10-04 05:47:17.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:17 np0005470441 nova_compute[192626]: 2025-10-04 05:47:17.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:18 np0005470441 nova_compute[192626]: 2025-10-04 05:47:18.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:18 np0005470441 nova_compute[192626]: 2025-10-04 05:47:18.726 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:47:19 np0005470441 podman[231096]: 2025-10-04 05:47:19.308458321 +0000 UTC m=+0.062757016 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:47:21 np0005470441 podman[231120]: 2025-10-04 05:47:21.301872079 +0000 UTC m=+0.056740843 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:47:22 np0005470441 nova_compute[192626]: 2025-10-04 05:47:22.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:22 np0005470441 nova_compute[192626]: 2025-10-04 05:47:22.436 2 INFO nova.compute.manager [None req-2d49c69f-eed9-4826-8755-96b8c4efee69 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Get console output#033[00m
Oct  4 01:47:22 np0005470441 nova_compute[192626]: 2025-10-04 05:47:22.445 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:47:23 np0005470441 nova_compute[192626]: 2025-10-04 05:47:23.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:23Z|00309|binding|INFO|Releasing lport 29b885b4-8a51-4780-a849-525802b3b5cd from this chassis (sb_readonly=0)
Oct  4 01:47:23 np0005470441 nova_compute[192626]: 2025-10-04 05:47:23.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:23 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:23Z|00310|binding|INFO|Releasing lport 29b885b4-8a51-4780-a849-525802b3b5cd from this chassis (sb_readonly=0)
Oct  4 01:47:23 np0005470441 nova_compute[192626]: 2025-10-04 05:47:23.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:24 np0005470441 podman[231140]: 2025-10-04 05:47:24.344152618 +0000 UTC m=+0.102982244 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_controller)
Oct  4 01:47:24 np0005470441 nova_compute[192626]: 2025-10-04 05:47:24.911 2 INFO nova.compute.manager [None req-b40c5cac-5c91-4bd4-8452-a78ae9a37614 b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Get console output#033[00m
Oct  4 01:47:24 np0005470441 nova_compute[192626]: 2025-10-04 05:47:24.916 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:47:26 np0005470441 NetworkManager[51690]: <info>  [1759556846.2022] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Oct  4 01:47:26 np0005470441 nova_compute[192626]: 2025-10-04 05:47:26.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:26 np0005470441 NetworkManager[51690]: <info>  [1759556846.2035] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Oct  4 01:47:26 np0005470441 nova_compute[192626]: 2025-10-04 05:47:26.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:26 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:26Z|00311|binding|INFO|Releasing lport 29b885b4-8a51-4780-a849-525802b3b5cd from this chassis (sb_readonly=0)
Oct  4 01:47:26 np0005470441 nova_compute[192626]: 2025-10-04 05:47:26.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:26 np0005470441 nova_compute[192626]: 2025-10-04 05:47:26.974 2 INFO nova.compute.manager [None req-8618885c-68e0-43da-ac39-3ae499c2561f b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Get console output#033[00m
Oct  4 01:47:26 np0005470441 nova_compute[192626]: 2025-10-04 05:47:26.980 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  4 01:47:27 np0005470441 nova_compute[192626]: 2025-10-04 05:47:27.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.165 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.166 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.166 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.167 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.167 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.170 2 INFO nova.compute.manager [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Terminating instance#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.172 2 DEBUG nova.compute.manager [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:47:28 np0005470441 kernel: tapa31216fb-28 (unregistering): left promiscuous mode
Oct  4 01:47:28 np0005470441 NetworkManager[51690]: <info>  [1759556848.1966] device (tapa31216fb-28): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:47:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:28Z|00312|binding|INFO|Releasing lport a31216fb-2878-4ad4-99fe-326d8971cb62 from this chassis (sb_readonly=0)
Oct  4 01:47:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:28Z|00313|binding|INFO|Setting lport a31216fb-2878-4ad4-99fe-326d8971cb62 down in Southbound
Oct  4 01:47:28 np0005470441 ovn_controller[94840]: 2025-10-04T05:47:28Z|00314|binding|INFO|Removing iface tapa31216fb-28 ovn-installed in OVS
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.230 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:b1:00 10.100.0.11'], port_security=['fa:16:3e:a2:b1:00 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4233ce36-e7ec-41b5-aff4-59969b5a18ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9f0c855-540f-4888-917f-7ede262ace90', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ec39d6d697445438e79b0bfc666a027', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8af7f447-9bdb-47f2-bf14-ef768c8dd981', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0206e0b6-0a94-4177-8673-c705f12dc5d7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a31216fb-2878-4ad4-99fe-326d8971cb62) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.231 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a31216fb-2878-4ad4-99fe-326d8971cb62 in datapath d9f0c855-540f-4888-917f-7ede262ace90 unbound from our chassis#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.232 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9f0c855-540f-4888-917f-7ede262ace90, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.233 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[906707e6-22c7-4f1d-8333-0b3bf8362e44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.234 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90 namespace which is not needed anymore#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Oct  4 01:47:28 np0005470441 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000002b.scope: Consumed 14.027s CPU time.
Oct  4 01:47:28 np0005470441 systemd-machined[152624]: Machine qemu-24-instance-0000002b terminated.
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [NOTICE]   (230995) : haproxy version is 2.8.14-c23fe91
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [NOTICE]   (230995) : path to executable is /usr/sbin/haproxy
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [WARNING]  (230995) : Exiting Master process...
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [WARNING]  (230995) : Exiting Master process...
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [ALERT]    (230995) : Current worker (230999) exited with code 143 (Terminated)
Oct  4 01:47:28 np0005470441 neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90[230991]: [WARNING]  (230995) : All workers exited. Exiting... (0)
Oct  4 01:47:28 np0005470441 systemd[1]: libpod-37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257.scope: Deactivated successfully.
Oct  4 01:47:28 np0005470441 podman[231190]: 2025-10-04 05:47:28.359950976 +0000 UTC m=+0.042679969 container died 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  4 01:47:28 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257-userdata-shm.mount: Deactivated successfully.
Oct  4 01:47:28 np0005470441 systemd[1]: var-lib-containers-storage-overlay-0c4a677f4caad4733150b8bd26d266d57deca54a444e46abfa1755a75d3d1213-merged.mount: Deactivated successfully.
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 podman[231190]: 2025-10-04 05:47:28.402122279 +0000 UTC m=+0.084851252 container cleanup 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  4 01:47:28 np0005470441 systemd[1]: libpod-conmon-37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257.scope: Deactivated successfully.
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.439 2 INFO nova.virt.libvirt.driver [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Instance destroyed successfully.#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.440 2 DEBUG nova.objects.instance [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lazy-loading 'resources' on Instance uuid 4233ce36-e7ec-41b5-aff4-59969b5a18ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.466 2 DEBUG nova.virt.libvirt.vif [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:46:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1165607435',display_name='tempest-TestNetworkBasicOps-server-1165607435',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1165607435',id=43,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIWlsvBeGeS9/pEzPjLNbP2Af5gbK0pEDhDhzs4f6aFsrWJkBtoP1d0SH2ByuqXjD3NsUKuBAyq04b7A8+th0w4wg75EKN7D9++H7bmQadj14BJQEzSTu0VyyhLTAOcCEQ==',key_name='tempest-TestNetworkBasicOps-1404055165',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:47:05Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ec39d6d697445438e79b0bfc666a027',ramdisk_id='',reservation_id='r-7dtrxtto',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-600174410',owner_user_name='tempest-TestNetworkBasicOps-600174410-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:47:05Z,user_data=None,user_id='b2989168a314457b9d68405a2e5b9ab8',uuid=4233ce36-e7ec-41b5-aff4-59969b5a18ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.467 2 DEBUG nova.network.os_vif_util [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converting VIF {"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.468 2 DEBUG nova.network.os_vif_util [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.469 2 DEBUG os_vif [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa31216fb-28, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.476 2 INFO os_vif [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:b1:00,bridge_name='br-int',has_traffic_filtering=True,id=a31216fb-2878-4ad4-99fe-326d8971cb62,network=Network(d9f0c855-540f-4888-917f-7ede262ace90),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa31216fb-28')#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.477 2 INFO nova.virt.libvirt.driver [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Deleting instance files /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed_del#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.478 2 INFO nova.virt.libvirt.driver [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Deletion of /var/lib/nova/instances/4233ce36-e7ec-41b5-aff4-59969b5a18ed_del complete#033[00m
Oct  4 01:47:28 np0005470441 podman[231230]: 2025-10-04 05:47:28.485928249 +0000 UTC m=+0.056441214 container remove 37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.490 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7bff7e52-6efe-4afd-ba17-05b633e29491]: (4, ('Sat Oct  4 05:47:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90 (37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257)\n37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257\nSat Oct  4 05:47:28 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90 (37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257)\n37ceb98ee813f4b028073a213d7911cdce48c702d214d83676f5746e5fd2f257\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.492 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[167c7a28-c919-4dea-8b17-8dfe35533d77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.492 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd9f0c855-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 kernel: tapd9f0c855-50: left promiscuous mode
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.520 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[025e9e89-75ee-4017-aa43-3302793a2de3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.538 2 INFO nova.compute.manager [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.538 2 DEBUG oslo.service.loopingcall [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.539 2 DEBUG nova.compute.manager [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:47:28 np0005470441 nova_compute[192626]: 2025-10-04 05:47:28.539 2 DEBUG nova.network.neutron [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.549 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[77a2acfe-8638-494c-b539-62c0b5ded13b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.551 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8ecac1a1-55ff-4023-907d-1354203c8502]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.566 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e27ec0-9301-4269-b1bd-482481284569]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 472115, 'reachable_time': 21142, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231251, 'error': None, 'target': 'ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.568 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d9f0c855-540f-4888-917f-7ede262ace90 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:47:28 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:28.569 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4d51cb-7c38-43d9-85ea-1e697c89e04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:47:28 np0005470441 systemd[1]: run-netns-ovnmeta\x2dd9f0c855\x2d540f\x2d4888\x2d917f\x2d7ede262ace90.mount: Deactivated successfully.
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.088 2 DEBUG nova.compute.manager [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-unplugged-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.089 2 DEBUG oslo_concurrency.lockutils [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.090 2 DEBUG oslo_concurrency.lockutils [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.090 2 DEBUG oslo_concurrency.lockutils [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.091 2 DEBUG nova.compute.manager [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] No waiting events found dispatching network-vif-unplugged-a31216fb-2878-4ad4-99fe-326d8971cb62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.091 2 DEBUG nova.compute.manager [req-c2932ecb-60b8-4ada-a2be-f1da89d6455f req-a087215f-8be6-498d-9f60-9af828928273 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-unplugged-a31216fb-2878-4ad4-99fe-326d8971cb62 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.256 2 DEBUG nova.compute.manager [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.257 2 DEBUG nova.compute.manager [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing instance network info cache due to event network-changed-a31216fb-2878-4ad4-99fe-326d8971cb62. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.258 2 DEBUG oslo_concurrency.lockutils [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.259 2 DEBUG oslo_concurrency.lockutils [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:47:29 np0005470441 nova_compute[192626]: 2025-10-04 05:47:29.259 2 DEBUG nova.network.neutron [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Refreshing network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.759 2 DEBUG nova.compute.manager [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.759 2 DEBUG oslo_concurrency.lockutils [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.760 2 DEBUG oslo_concurrency.lockutils [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.760 2 DEBUG oslo_concurrency.lockutils [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.760 2 DEBUG nova.compute.manager [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] No waiting events found dispatching network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:47:31 np0005470441 nova_compute[192626]: 2025-10-04 05:47:31.761 2 WARNING nova.compute.manager [req-d94011d2-ce60-4dc8-9a2b-1e0e35fb63e8 req-c510adea-dc00-44ae-bdc5-02294c04ae61 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received unexpected event network-vif-plugged-a31216fb-2878-4ad4-99fe-326d8971cb62 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.048 2 DEBUG nova.network.neutron [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.074 2 INFO nova.compute.manager [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Took 3.54 seconds to deallocate network for instance.#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.103 2 DEBUG nova.compute.manager [req-2f2686ac-b046-43d6-a2c9-ad6e7055bfd4 req-824f35b0-91e3-4ad6-9654-f1ccbc42d7c3 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Received event network-vif-deleted-a31216fb-2878-4ad4-99fe-326d8971cb62 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.134 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.134 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.205 2 DEBUG nova.compute.provider_tree [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.223 2 DEBUG nova.scheduler.client.report [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.248 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.277 2 INFO nova.scheduler.client.report [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Deleted allocations for instance 4233ce36-e7ec-41b5-aff4-59969b5a18ed#033[00m
Oct  4 01:47:32 np0005470441 podman[231253]: 2025-10-04 05:47:32.334685305 +0000 UTC m=+0.070008205 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:47:32 np0005470441 podman[231252]: 2025-10-04 05:47:32.356326118 +0000 UTC m=+0.089407003 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:47:32 np0005470441 nova_compute[192626]: 2025-10-04 05:47:32.362 2 DEBUG oslo_concurrency.lockutils [None req-b3b46007-97a0-46b3-9f70-7b77215f708b b2989168a314457b9d68405a2e5b9ab8 7ec39d6d697445438e79b0bfc666a027 - - default default] Lock "4233ce36-e7ec-41b5-aff4-59969b5a18ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:47:33 np0005470441 nova_compute[192626]: 2025-10-04 05:47:33.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:33 np0005470441 nova_compute[192626]: 2025-10-04 05:47:33.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:34 np0005470441 nova_compute[192626]: 2025-10-04 05:47:34.167 2 DEBUG nova.network.neutron [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updated VIF entry in instance network info cache for port a31216fb-2878-4ad4-99fe-326d8971cb62. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:47:34 np0005470441 nova_compute[192626]: 2025-10-04 05:47:34.168 2 DEBUG nova.network.neutron [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Updating instance_info_cache with network_info: [{"id": "a31216fb-2878-4ad4-99fe-326d8971cb62", "address": "fa:16:3e:a2:b1:00", "network": {"id": "d9f0c855-540f-4888-917f-7ede262ace90", "bridge": "br-int", "label": "tempest-network-smoke--1887011077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ec39d6d697445438e79b0bfc666a027", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa31216fb-28", "ovs_interfaceid": "a31216fb-2878-4ad4-99fe-326d8971cb62", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:47:34 np0005470441 nova_compute[192626]: 2025-10-04 05:47:34.199 2 DEBUG oslo_concurrency.lockutils [req-429b3371-7cfc-4486-a7b2-9b1400b6cda8 req-ac763beb-1ca4-4707-b67d-afb7c836a7da 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-4233ce36-e7ec-41b5-aff4-59969b5a18ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:47:36 np0005470441 nova_compute[192626]: 2025-10-04 05:47:36.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:36.413 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:47:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:36.415 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:47:37 np0005470441 podman[231293]: 2025-10-04 05:47:37.332322056 +0000 UTC m=+0.078383706 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm)
Oct  4 01:47:37 np0005470441 podman[231292]: 2025-10-04 05:47:37.332730837 +0000 UTC m=+0.075853852 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:47:38 np0005470441 nova_compute[192626]: 2025-10-04 05:47:38.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:38 np0005470441 nova_compute[192626]: 2025-10-04 05:47:38.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:38 np0005470441 nova_compute[192626]: 2025-10-04 05:47:38.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:38 np0005470441 nova_compute[192626]: 2025-10-04 05:47:38.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:43 np0005470441 nova_compute[192626]: 2025-10-04 05:47:43.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:43 np0005470441 podman[231333]: 2025-10-04 05:47:43.313913819 +0000 UTC m=+0.063909739 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  4 01:47:43 np0005470441 nova_compute[192626]: 2025-10-04 05:47:43.436 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556848.4356496, 4233ce36-e7ec-41b5-aff4-59969b5a18ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:47:43 np0005470441 nova_compute[192626]: 2025-10-04 05:47:43.437 2 INFO nova.compute.manager [-] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:47:43 np0005470441 nova_compute[192626]: 2025-10-04 05:47:43.468 2 DEBUG nova.compute.manager [None req-69a063b9-78a9-4eba-bb81-bae255373b86 - - - - - -] [instance: 4233ce36-e7ec-41b5-aff4-59969b5a18ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:47:43 np0005470441 nova_compute[192626]: 2025-10-04 05:47:43.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:47:44.417 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:47:48 np0005470441 nova_compute[192626]: 2025-10-04 05:47:48.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:48 np0005470441 nova_compute[192626]: 2025-10-04 05:47:48.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:50 np0005470441 podman[231355]: 2025-10-04 05:47:50.322727258 +0000 UTC m=+0.081967098 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:47:52 np0005470441 podman[231381]: 2025-10-04 05:47:52.332423785 +0000 UTC m=+0.076678046 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 01:47:53 np0005470441 nova_compute[192626]: 2025-10-04 05:47:53.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:53 np0005470441 nova_compute[192626]: 2025-10-04 05:47:53.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:55 np0005470441 podman[231400]: 2025-10-04 05:47:55.319401252 +0000 UTC m=+0.076357807 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  4 01:47:58 np0005470441 nova_compute[192626]: 2025-10-04 05:47:58.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:47:58 np0005470441 nova_compute[192626]: 2025-10-04 05:47:58.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:48:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:48:03 np0005470441 nova_compute[192626]: 2025-10-04 05:48:03.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:03 np0005470441 podman[231424]: 2025-10-04 05:48:03.31478277 +0000 UTC m=+0.066776772 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:48:03 np0005470441 podman[231425]: 2025-10-04 05:48:03.336035081 +0000 UTC m=+0.075321128 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:48:03 np0005470441 nova_compute[192626]: 2025-10-04 05:48:03.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:06 np0005470441 nova_compute[192626]: 2025-10-04 05:48:06.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:06.755 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:06.756 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:06.756 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:07 np0005470441 nova_compute[192626]: 2025-10-04 05:48:07.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:08 np0005470441 nova_compute[192626]: 2025-10-04 05:48:08.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:08 np0005470441 podman[231465]: 2025-10-04 05:48:08.301407583 +0000 UTC m=+0.053741817 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:48:08 np0005470441 podman[231464]: 2025-10-04 05:48:08.320311096 +0000 UTC m=+0.074739030 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:48:08 np0005470441 nova_compute[192626]: 2025-10-04 05:48:08.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:08 np0005470441 nova_compute[192626]: 2025-10-04 05:48:08.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:08 np0005470441 nova_compute[192626]: 2025-10-04 05:48:08.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:48:11 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:11Z|00315|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  4 01:48:12 np0005470441 nova_compute[192626]: 2025-10-04 05:48:12.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:12 np0005470441 nova_compute[192626]: 2025-10-04 05:48:12.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:48:12 np0005470441 nova_compute[192626]: 2025-10-04 05:48:12.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:48:12 np0005470441 nova_compute[192626]: 2025-10-04 05:48:12.731 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:48:13 np0005470441 nova_compute[192626]: 2025-10-04 05:48:13.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:13 np0005470441 nova_compute[192626]: 2025-10-04 05:48:13.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:13 np0005470441 nova_compute[192626]: 2025-10-04 05:48:13.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:14 np0005470441 podman[231502]: 2025-10-04 05:48:14.294551088 +0000 UTC m=+0.051929665 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.750 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.751 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.948 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.950 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.42045974731445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.950 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:14 np0005470441 nova_compute[192626]: 2025-10-04 05:48:14.950 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.008 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.009 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.039 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.054 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.074 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:48:15 np0005470441 nova_compute[192626]: 2025-10-04 05:48:15.074 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:16 np0005470441 nova_compute[192626]: 2025-10-04 05:48:16.076 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:17 np0005470441 nova_compute[192626]: 2025-10-04 05:48:17.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.794 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.795 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.811 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.873 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.873 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.880 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.880 2 INFO nova.compute.claims [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.983 2 DEBUG nova.compute.provider_tree [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:48:18 np0005470441 nova_compute[192626]: 2025-10-04 05:48:18.998 2 DEBUG nova.scheduler.client.report [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.027 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.027 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.072 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.073 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.096 2 INFO nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.117 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.225 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.226 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.227 2 INFO nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Creating image(s)#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.227 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.228 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.229 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.245 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.310 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.311 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.312 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.324 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.389 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.390 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.457 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk 1073741824" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
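The `qemu-img create` invocation above shows Nova's Qcow2 image-backend pattern: the per-instance disk is a thin copy-on-write overlay whose backing file is the shared, content-addressed base image under `_base`. A sketch that assembles the same argv (paths and size copied from the log; this only builds the command list, it does not shell out):

```python
# Sketch of the qcow2 overlay command Nova runs above: the instance disk
# is a copy-on-write overlay on a cached raw base image, so creating it
# is near-instant and consumes almost no disk up front.
# This helper is illustrative; it builds the argv without invoking qemu-img.

def qemu_img_create_overlay(base: str, target: str, size_bytes: int) -> list:
    return [
        "qemu-img", "create",
        "-f", "qcow2",                                   # overlay format
        "-o", f"backing_file={base},backing_fmt=raw",    # shared raw base
        target,
        str(size_bytes),                                 # virtual size
    ]

argv = qemu_img_create_overlay(
    "/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e",
    "/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk",
    1073741824,  # 1 GiB, matching the flavor's root disk in the log
)
print(" ".join(argv))
```

The repeated `qemu-img info --force-share --output=json` calls around it (run under `prlimit` to cap memory and CPU) are how Nova validates the base image before and after building the overlay.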
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.458 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.460 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.523 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.524 2 DEBUG nova.virt.disk.api [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.525 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.602 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.605 2 DEBUG nova.virt.disk.api [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
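The "Cannot resize image ... to a smaller size" debug line is benign: the flavor's requested size equals (or is below) the image's virtual size, so Nova skips the grow step. The check boils down to comparing the requested bytes against `virtual-size` from `qemu-img info --output=json`. A hedged sketch of that comparison (the JSON sample is illustrative, not taken from this host):

```python
import json

# Sketch of the resize gate logged above: Nova reads the image's
# virtual size from `qemu-img info --output=json` and only grows,
# never shrinks, a disk. Illustrative helper, not Nova's disk API.

def can_grow_image(qemu_img_info_json: str, requested_bytes: int) -> bool:
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes >= virtual_size

# Hypothetical qemu-img info output for a 1 GiB qcow2 disk.
info = json.dumps({"virtual-size": 1073741824, "format": "qcow2"})

print(can_grow_image(info, 1073741824))  # equal size: nothing to do, allowed
print(can_grow_image(info, 536870912))   # smaller: refused, as in the log
```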
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.605 2 DEBUG nova.objects.instance [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 029e6753-0289-499a-82e9-f687cb1c3adc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.623 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.624 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Ensure instance console log exists: /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.625 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.625 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:19 np0005470441 nova_compute[192626]: 2025-10-04 05:48:19.625 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:20 np0005470441 nova_compute[192626]: 2025-10-04 05:48:20.148 2 DEBUG nova.policy [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:48:21 np0005470441 podman[231538]: 2025-10-04 05:48:21.338809888 +0000 UTC m=+0.090757161 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 01:48:22 np0005470441 nova_compute[192626]: 2025-10-04 05:48:22.116 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Successfully created port: d6685677-f121-4050-b030-28c2f7047497 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:48:23 np0005470441 nova_compute[192626]: 2025-10-04 05:48:23.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:23 np0005470441 podman[231562]: 2025-10-04 05:48:23.301227715 +0000 UTC m=+0.051335598 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:48:23 np0005470441 nova_compute[192626]: 2025-10-04 05:48:23.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:24 np0005470441 nova_compute[192626]: 2025-10-04 05:48:24.208 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Successfully created port: c6c4bf42-05a9-44dd-bcf7-59b8014196a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.469 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Successfully updated port: d6685677-f121-4050-b030-28c2f7047497 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.613 2 DEBUG nova.compute.manager [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-changed-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.614 2 DEBUG nova.compute.manager [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing instance network info cache due to event network-changed-d6685677-f121-4050-b030-28c2f7047497. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.614 2 DEBUG oslo_concurrency.lockutils [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.615 2 DEBUG oslo_concurrency.lockutils [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:48:25 np0005470441 nova_compute[192626]: 2025-10-04 05:48:25.615 2 DEBUG nova.network.neutron [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing network info cache for port d6685677-f121-4050-b030-28c2f7047497 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:48:26 np0005470441 nova_compute[192626]: 2025-10-04 05:48:26.164 2 DEBUG nova.network.neutron [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:48:26 np0005470441 podman[231582]: 2025-10-04 05:48:26.381330241 +0000 UTC m=+0.129667391 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:48:26 np0005470441 nova_compute[192626]: 2025-10-04 05:48:26.542 2 DEBUG nova.network.neutron [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:26 np0005470441 nova_compute[192626]: 2025-10-04 05:48:26.585 2 DEBUG oslo_concurrency.lockutils [req-bf7d50ad-b0b6-4e05-b3a8-073933a140c0 req-077790c1-0054-4440-97ec-950326f4b3e7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.254 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Successfully updated port: c6c4bf42-05a9-44dd-bcf7-59b8014196a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.275 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.276 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.276 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.450 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.700 2 DEBUG nova.compute.manager [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-changed-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.701 2 DEBUG nova.compute.manager [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing instance network info cache due to event network-changed-c6c4bf42-05a9-44dd-bcf7-59b8014196a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:48:27 np0005470441 nova_compute[192626]: 2025-10-04 05:48:27.701 2 DEBUG oslo_concurrency.lockutils [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:48:28 np0005470441 nova_compute[192626]: 2025-10-04 05:48:28.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:28 np0005470441 nova_compute[192626]: 2025-10-04 05:48:28.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.553 2 DEBUG nova.network.neutron [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.574 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.575 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance network_info: |[{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.576 2 DEBUG oslo_concurrency.lockutils [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.576 2 DEBUG nova.network.neutron [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing network info cache for port c6c4bf42-05a9-44dd-bcf7-59b8014196a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.581 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Start _get_guest_xml network_info=[{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.588 2 WARNING nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.599 2 DEBUG nova.virt.libvirt.host [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.600 2 DEBUG nova.virt.libvirt.host [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.604 2 DEBUG nova.virt.libvirt.host [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.605 2 DEBUG nova.virt.libvirt.host [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.606 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.607 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.608 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.608 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.608 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.609 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.609 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.609 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.609 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.610 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.610 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.610 2 DEBUG nova.virt.hardware [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.615 2 DEBUG nova.virt.libvirt.vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:48:19Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.615 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.616 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.617 2 DEBUG nova.virt.libvirt.vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:48:19Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.617 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.618 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.619 2 DEBUG nova.objects.instance [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 029e6753-0289-499a-82e9-f687cb1c3adc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.633 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <uuid>029e6753-0289-499a-82e9-f687cb1c3adc</uuid>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <name>instance-0000002d</name>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-1505697440</nova:name>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:48:29</nova:creationTime>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:port uuid="d6685677-f121-4050-b030-28c2f7047497">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        <nova:port uuid="c6c4bf42-05a9-44dd-bcf7-59b8014196a8">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:febd:925e" ipVersion="6"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="serial">029e6753-0289-499a-82e9-f687cb1c3adc</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="uuid">029e6753-0289-499a-82e9-f687cb1c3adc</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.config"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:a7:c1:74"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <target dev="tapd6685677-f1"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:bd:92:5e"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <target dev="tapc6c4bf42-05"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/console.log" append="off"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:48:29 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:48:29 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:48:29 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:48:29 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.634 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Preparing to wait for external event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.634 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.635 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.635 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.635 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Preparing to wait for external event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.636 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.636 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.636 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.637 2 DEBUG nova.virt.libvirt.vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:48:19Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.637 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.638 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.638 2 DEBUG os_vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.639 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.640 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6685677-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6685677-f1, col_values=(('external_ids', {'iface-id': 'd6685677-f121-4050-b030-28c2f7047497', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a7:c1:74', 'vm-uuid': '029e6753-0289-499a-82e9-f687cb1c3adc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:48:29 np0005470441 NetworkManager[51690]: <info>  [1759556909.6455] manager: (tapd6685677-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.652 2 INFO os_vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1')#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.653 2 DEBUG nova.virt.libvirt.vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:48:19Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.653 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.654 2 DEBUG nova.network.os_vif_util [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.654 2 DEBUG os_vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.655 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.657 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc6c4bf42-05, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.658 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc6c4bf42-05, col_values=(('external_ids', {'iface-id': 'c6c4bf42-05a9-44dd-bcf7-59b8014196a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:92:5e', 'vm-uuid': '029e6753-0289-499a-82e9-f687cb1c3adc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:29 np0005470441 NetworkManager[51690]: <info>  [1759556909.6607] manager: (tapc6c4bf42-05): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.668 2 INFO os_vif [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05')#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.728 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.729 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.729 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:a7:c1:74, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.729 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:bd:92:5e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:48:29 np0005470441 nova_compute[192626]: 2025-10-04 05:48:29.730 2 INFO nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Using config drive#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.028 2 INFO nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Creating config drive at /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.config#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.033 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ee3760y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.165 2 DEBUG oslo_concurrency.processutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0ee3760y" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:48:30 np0005470441 kernel: tapd6685677-f1: entered promiscuous mode
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.2463] manager: (tapd6685677-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/148)
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00316|binding|INFO|Claiming lport d6685677-f121-4050-b030-28c2f7047497 for this chassis.
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00317|binding|INFO|d6685677-f121-4050-b030-28c2f7047497: Claiming fa:16:3e:a7:c1:74 10.100.0.7
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.3230] manager: (tapc6c4bf42-05): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.3257] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/150)
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.3264] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.340 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:c1:74 10.100.0.7'], port_security=['fa:16:3e:a7:c1:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '029e6753-0289-499a-82e9-f687cb1c3adc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6815d8ee-74cf-4ed7-8e8b-b2daf532177a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f35b87d2-355f-4fd4-aa70-51ec7a8b0e9a, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=d6685677-f121-4050-b030-28c2f7047497) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.342 103689 INFO neutron.agent.ovn.metadata.agent [-] Port d6685677-f121-4050-b030-28c2f7047497 in datapath 1572d154-338b-48bc-bb2d-34c8fe54b7b5 bound to our chassis#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.343 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1572d154-338b-48bc-bb2d-34c8fe54b7b5#033[00m
Oct  4 01:48:30 np0005470441 systemd-udevd[231634]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:48:30 np0005470441 systemd-udevd[231631]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.353 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[25600603-7b0a-45d5-a3ee-23c32b05df97]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.354 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1572d154-31 in ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.357 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1572d154-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.357 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a76fbaec-4d8d-43ff-9b8b-5c3f47278404]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.358 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[72f11c20-e2ae-4372-9c07-0ded272460d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.3660] device (tapd6685677-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.3674] device (tapd6685677-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.371 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[7495c068-0963-4d24-a643-6255d513bae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 systemd-machined[152624]: New machine qemu-25-instance-0000002d.
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.403 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[084c332e-dcb8-4dc6-a1fa-73007acc1f88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 kernel: tapc6c4bf42-05: entered promiscuous mode
Oct  4 01:48:30 np0005470441 systemd[1]: Started Virtual Machine qemu-25-instance-0000002d.
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.4257] device (tapc6c4bf42-05): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.4275] device (tapc6c4bf42-05): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00318|binding|INFO|Claiming lport c6c4bf42-05a9-44dd-bcf7-59b8014196a8 for this chassis.
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00319|binding|INFO|c6c4bf42-05a9-44dd-bcf7-59b8014196a8: Claiming fa:16:3e:bd:92:5e 2001:db8::f816:3eff:febd:925e
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.443 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[0f5adfaf-6c50-4859-abc7-f2f7f9206972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00320|binding|INFO|Setting lport d6685677-f121-4050-b030-28c2f7047497 ovn-installed in OVS
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 systemd-udevd[231639]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.455 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c0417645-72cb-4134-a51e-8a7b7e28e18e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.4568] manager: (tap1572d154-30): new Veth device (/org/freedesktop/NetworkManager/Devices/152)
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.458 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:92:5e 2001:db8::f816:3eff:febd:925e'], port_security=['fa:16:3e:bd:92:5e 2001:db8::f816:3eff:febd:925e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febd:925e/64', 'neutron:device_id': '029e6753-0289-499a-82e9-f687cb1c3adc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1d8eda1-499a-4d9d-af36-998257565133', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6815d8ee-74cf-4ed7-8e8b-b2daf532177a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90d20d70-71ab-44e8-8c3c-f17c571c614c, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c6c4bf42-05a9-44dd-bcf7-59b8014196a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00321|binding|INFO|Setting lport d6685677-f121-4050-b030-28c2f7047497 up in Southbound
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00322|binding|INFO|Setting lport c6c4bf42-05a9-44dd-bcf7-59b8014196a8 ovn-installed in OVS
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00323|binding|INFO|Setting lport c6c4bf42-05a9-44dd-bcf7-59b8014196a8 up in Southbound
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.497 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[bd432730-888c-4dd9-9b94-75d47b678602]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.501 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8198b796-74d2-4301-ba82-258f8f34276e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.5276] device (tap1572d154-30): carrier: link connected
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.534 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[4a7ecc33-dd3c-4126-a1ad-5ec0f69ec1fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.555 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3ffd9592-9e82-4d4b-b44b-4862cdc5098f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1572d154-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:83:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480818, 'reachable_time': 28796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231667, 'error': None, 'target': 'ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.574 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[98cd5fe7-b492-4873-8514-6d6b18b14fb6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:8340'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480818, 'tstamp': 480818}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231668, 'error': None, 'target': 'ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.594 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7622eb62-733b-44ac-9ae1-b04f0ae247ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1572d154-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:83:40'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480818, 'reachable_time': 28796, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231669, 'error': None, 'target': 'ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.637 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5f969874-0406-47f9-aee1-f4f08bf2ad12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.704 2 DEBUG nova.compute.manager [req-21d225cb-ba8a-41ec-a73a-1c3b2415d402 req-8eec8ef3-f9c0-4e2e-81de-33d92802e779 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.705 2 DEBUG oslo_concurrency.lockutils [req-21d225cb-ba8a-41ec-a73a-1c3b2415d402 req-8eec8ef3-f9c0-4e2e-81de-33d92802e779 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.706 2 DEBUG oslo_concurrency.lockutils [req-21d225cb-ba8a-41ec-a73a-1c3b2415d402 req-8eec8ef3-f9c0-4e2e-81de-33d92802e779 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.706 2 DEBUG oslo_concurrency.lockutils [req-21d225cb-ba8a-41ec-a73a-1c3b2415d402 req-8eec8ef3-f9c0-4e2e-81de-33d92802e779 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.706 2 DEBUG nova.compute.manager [req-21d225cb-ba8a-41ec-a73a-1c3b2415d402 req-8eec8ef3-f9c0-4e2e-81de-33d92802e779 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Processing event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.717 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d32b139a-426c-43d8-b514-24793c8fdf6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.718 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1572d154-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.718 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.718 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1572d154-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:30 np0005470441 NetworkManager[51690]: <info>  [1759556910.7212] manager: (tap1572d154-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Oct  4 01:48:30 np0005470441 kernel: tap1572d154-30: entered promiscuous mode
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.723 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1572d154-30, col_values=(('external_ids', {'iface-id': '4ccb7c52-91c2-44dd-9cab-9ef950fabe13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:30 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:30Z|00324|binding|INFO|Releasing lport 4ccb7c52-91c2-44dd-9cab-9ef950fabe13 from this chassis (sb_readonly=0)
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.737 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1572d154-338b-48bc-bb2d-34c8fe54b7b5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1572d154-338b-48bc-bb2d-34c8fe54b7b5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.738 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c3455bfa-9502-4a7c-be7a-696bf07c9417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.739 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-1572d154-338b-48bc-bb2d-34c8fe54b7b5
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/1572d154-338b-48bc-bb2d-34c8fe54b7b5.pid.haproxy
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 1572d154-338b-48bc-bb2d-34c8fe54b7b5
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.740 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'env', 'PROCESS_TAG=haproxy-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1572d154-338b-48bc-bb2d-34c8fe54b7b5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.740 2 DEBUG nova.network.neutron [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updated VIF entry in instance network info cache for port c6c4bf42-05a9-44dd-bcf7-59b8014196a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.740 2 DEBUG nova.network.neutron [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.772 2 DEBUG oslo_concurrency.lockutils [req-c14576de-b326-4ee9-aeab-f8a86346c239 req-6e6c038f-20d0-4240-beb0-289b9c983123 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:48:30 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:30.789 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.828 2 DEBUG nova.compute.manager [req-65907c84-00ae-4352-ac13-ef5eb7e1a684 req-36224e41-4e18-45b6-99b9-a358d183666c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.829 2 DEBUG oslo_concurrency.lockutils [req-65907c84-00ae-4352-ac13-ef5eb7e1a684 req-36224e41-4e18-45b6-99b9-a358d183666c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.829 2 DEBUG oslo_concurrency.lockutils [req-65907c84-00ae-4352-ac13-ef5eb7e1a684 req-36224e41-4e18-45b6-99b9-a358d183666c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.830 2 DEBUG oslo_concurrency.lockutils [req-65907c84-00ae-4352-ac13-ef5eb7e1a684 req-36224e41-4e18-45b6-99b9-a358d183666c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:30 np0005470441 nova_compute[192626]: 2025-10-04 05:48:30.830 2 DEBUG nova.compute.manager [req-65907c84-00ae-4352-ac13-ef5eb7e1a684 req-36224e41-4e18-45b6-99b9-a358d183666c 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Processing event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:48:31 np0005470441 podman[231709]: 2025-10-04 05:48:31.16507344 +0000 UTC m=+0.057283619 container create 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:48:31 np0005470441 systemd[1]: Started libpod-conmon-72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf.scope.
Oct  4 01:48:31 np0005470441 podman[231709]: 2025-10-04 05:48:31.134250593 +0000 UTC m=+0.026460822 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:48:31 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:48:31 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5165ceea94b10106092d6c906f2c0d5126d0771e7cf07f26b1e2370bb0a086e9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:48:31 np0005470441 podman[231709]: 2025-10-04 05:48:31.259404163 +0000 UTC m=+0.151614382 container init 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  4 01:48:31 np0005470441 podman[231709]: 2025-10-04 05:48:31.272680855 +0000 UTC m=+0.164891084 container start 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  4 01:48:31 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [NOTICE]   (231728) : New worker (231730) forked
Oct  4 01:48:31 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [NOTICE]   (231728) : Loading success.
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.331 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c6c4bf42-05a9-44dd-bcf7-59b8014196a8 in datapath a1d8eda1-499a-4d9d-af36-998257565133 unbound from our chassis#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.334 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a1d8eda1-499a-4d9d-af36-998257565133#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.346 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[45669771-2ffe-4abb-a11a-6004315154dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.348 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa1d8eda1-41 in ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.350 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa1d8eda1-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.350 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[17ed393d-bb59-4cd6-8cd9-d79a996a98fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.352 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[dcede0c7-c969-4df7-b285-12680d4de152]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.368 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[26c1188c-87be-400d-ae82-c16e9be031f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.385 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556911.3853185, 029e6753-0289-499a-82e9-f687cb1c3adc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.386 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] VM Started (Lifecycle Event)#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.388 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.392 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.393 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[345044c0-6951-4579-922e-62933283abcf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.396 2 INFO nova.virt.libvirt.driver [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance spawned successfully.#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.397 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.407 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.411 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.422 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.422 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.423 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.424 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.425 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.426 2 DEBUG nova.virt.libvirt.driver [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.431 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[21574cdb-5ce8-4aec-a2d5-1600e5ca3284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.432 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.433 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556911.386275, 029e6753-0289-499a-82e9-f687cb1c3adc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.433 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:48:31 np0005470441 systemd-udevd[231651]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:48:31 np0005470441 NetworkManager[51690]: <info>  [1759556911.4407] manager: (tapa1d8eda1-40): new Veth device (/org/freedesktop/NetworkManager/Devices/154)
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.439 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e31225b5-75bf-4bb8-be28-c02f608dfef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.461 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.465 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759556911.3912613, 029e6753-0289-499a-82e9-f687cb1c3adc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.465 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.492 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ea078657-5159-40c3-a43a-6688ff2f7c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.493 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.496 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[75a32e58-3382-434a-a91b-3fbeed6812f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.499 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.503 2 INFO nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Took 12.28 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.504 2 DEBUG nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.517 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:48:31 np0005470441 NetworkManager[51690]: <info>  [1759556911.5294] device (tapa1d8eda1-40): carrier: link connected
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.536 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[54a80e12-8955-4ecf-858e-3a9eb4cd0aa7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.553 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f9e08df2-449a-43ea-ac47-2b41c360ac98]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1d8eda1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:77:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480918, 'reachable_time': 42559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231749, 'error': None, 'target': 'ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.572 2 INFO nova.compute.manager [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Took 12.72 seconds to build instance.#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.573 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5de2d156-4569-4cc2-864e-412dc4a9253c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:77af'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 480918, 'tstamp': 480918}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231750, 'error': None, 'target': 'ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.591 2 DEBUG oslo_concurrency.lockutils [None req-7c5bbc93-41fe-4200-a016-a5c199c669fe 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.594 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7970e75e-1fb3-4220-83e7-005ca0fa5718]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa1d8eda1-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:77:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 87], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480918, 'reachable_time': 42559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231751, 'error': None, 'target': 'ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.629 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6371551f-73be-41e5-bea7-029e943c7b58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.666 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4a7aa0-5eeb-46d9-b6cf-34c450b53fc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.668 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1d8eda1-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.668 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.669 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1d8eda1-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:31 np0005470441 kernel: tapa1d8eda1-40: entered promiscuous mode
Oct  4 01:48:31 np0005470441 NetworkManager[51690]: <info>  [1759556911.7157] manager: (tapa1d8eda1-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.722 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa1d8eda1-40, col_values=(('external_ids', {'iface-id': '2d99649a-a9f8-440d-a3a9-d741529db75b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:31 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:31Z|00325|binding|INFO|Releasing lport 2d99649a-a9f8-440d-a3a9-d741529db75b from this chassis (sb_readonly=0)
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.729 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a1d8eda1-499a-4d9d-af36-998257565133.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a1d8eda1-499a-4d9d-af36-998257565133.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.730 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ee19c6e9-b2d1-41cb-b93a-f3e6e2fefc45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.731 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-a1d8eda1-499a-4d9d-af36-998257565133
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/a1d8eda1-499a-4d9d-af36-998257565133.pid.haproxy
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID a1d8eda1-499a-4d9d-af36-998257565133
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:48:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:31.731 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133', 'env', 'PROCESS_TAG=haproxy-a1d8eda1-499a-4d9d-af36-998257565133', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a1d8eda1-499a-4d9d-af36-998257565133.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:31 np0005470441 nova_compute[192626]: 2025-10-04 05:48:31.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:32 np0005470441 podman[231781]: 2025-10-04 05:48:32.149697312 +0000 UTC m=+0.067804821 container create 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:48:32 np0005470441 systemd[1]: Started libpod-conmon-1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e.scope.
Oct  4 01:48:32 np0005470441 podman[231781]: 2025-10-04 05:48:32.108431245 +0000 UTC m=+0.026538744 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:48:32 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:48:32 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba1f5750e51ed5d6e07cab52f63d056460f043b2912ae93cbce6ec26a2797f72/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:48:32 np0005470441 podman[231781]: 2025-10-04 05:48:32.240812713 +0000 UTC m=+0.158920202 container init 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:48:32 np0005470441 podman[231781]: 2025-10-04 05:48:32.252416687 +0000 UTC m=+0.170524156 container start 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:48:32 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [NOTICE]   (231800) : New worker (231802) forked
Oct  4 01:48:32 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [NOTICE]   (231800) : Loading success.
Oct  4 01:48:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:32.346 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.783 2 DEBUG nova.compute.manager [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.784 2 DEBUG oslo_concurrency.lockutils [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.785 2 DEBUG oslo_concurrency.lockutils [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.785 2 DEBUG oslo_concurrency.lockutils [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.785 2 DEBUG nova.compute.manager [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.786 2 WARNING nova.compute.manager [req-89910445-c522-42d7-b686-6b6dad9ea5e5 req-1a8db9ce-fa8d-41a8-936f-1799123be945 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received unexpected event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.907 2 DEBUG nova.compute.manager [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.908 2 DEBUG oslo_concurrency.lockutils [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.909 2 DEBUG oslo_concurrency.lockutils [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.909 2 DEBUG oslo_concurrency.lockutils [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.909 2 DEBUG nova.compute.manager [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:32 np0005470441 nova_compute[192626]: 2025-10-04 05:48:32.910 2 WARNING nova.compute.manager [req-9866800a-91b8-4e3b-bf5f-c75b4f406715 req-c71b481c-bf86-42c0-b451-ec972e9b19de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received unexpected event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:48:33 np0005470441 nova_compute[192626]: 2025-10-04 05:48:33.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:34 np0005470441 podman[231812]: 2025-10-04 05:48:34.335866654 +0000 UTC m=+0.074529335 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:48:34 np0005470441 podman[231813]: 2025-10-04 05:48:34.342059512 +0000 UTC m=+0.074402111 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:48:34 np0005470441 nova_compute[192626]: 2025-10-04 05:48:34.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:37 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:37.349 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.597 2 DEBUG nova.compute.manager [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-changed-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.599 2 DEBUG nova.compute.manager [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing instance network info cache due to event network-changed-d6685677-f121-4050-b030-28c2f7047497. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.600 2 DEBUG oslo_concurrency.lockutils [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.600 2 DEBUG oslo_concurrency.lockutils [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:48:38 np0005470441 nova_compute[192626]: 2025-10-04 05:48:38.600 2 DEBUG nova.network.neutron [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing network info cache for port d6685677-f121-4050-b030-28c2f7047497 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:48:39 np0005470441 podman[231856]: 2025-10-04 05:48:39.332350653 +0000 UTC m=+0.081042052 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  4 01:48:39 np0005470441 podman[231857]: 2025-10-04 05:48:39.336832002 +0000 UTC m=+0.076397229 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:48:39 np0005470441 nova_compute[192626]: 2025-10-04 05:48:39.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:40 np0005470441 nova_compute[192626]: 2025-10-04 05:48:40.290 2 DEBUG nova.network.neutron [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updated VIF entry in instance network info cache for port d6685677-f121-4050-b030-28c2f7047497. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:48:40 np0005470441 nova_compute[192626]: 2025-10-04 05:48:40.298 2 DEBUG nova.network.neutron [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:40 np0005470441 nova_compute[192626]: 2025-10-04 05:48:40.329 2 DEBUG oslo_concurrency.lockutils [req-61341f07-f6b0-4738-b6d3-07afd5be9f1f req-100abca9-f522-4bc2-92f9-7bb067942495 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:48:43 np0005470441 nova_compute[192626]: 2025-10-04 05:48:43.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:44 np0005470441 nova_compute[192626]: 2025-10-04 05:48:44.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:44 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:44Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a7:c1:74 10.100.0.7
Oct  4 01:48:44 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:44Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a7:c1:74 10.100.0.7
Oct  4 01:48:45 np0005470441 podman[231903]: 2025-10-04 05:48:45.347454661 +0000 UTC m=+0.093361106 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  4 01:48:48 np0005470441 nova_compute[192626]: 2025-10-04 05:48:48.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:49 np0005470441 nova_compute[192626]: 2025-10-04 05:48:49.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:52 np0005470441 podman[231927]: 2025-10-04 05:48:52.303565509 +0000 UTC m=+0.054758876 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:48:53 np0005470441 nova_compute[192626]: 2025-10-04 05:48:53.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:54 np0005470441 podman[231951]: 2025-10-04 05:48:54.334729572 +0000 UTC m=+0.079959751 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Oct  4 01:48:54 np0005470441 nova_compute[192626]: 2025-10-04 05:48:54.669 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.867 2 DEBUG nova.compute.manager [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-changed-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.867 2 DEBUG nova.compute.manager [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing instance network info cache due to event network-changed-d6685677-f121-4050-b030-28c2f7047497. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.868 2 DEBUG oslo_concurrency.lockutils [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.868 2 DEBUG oslo_concurrency.lockutils [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.869 2 DEBUG nova.network.neutron [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Refreshing network info cache for port d6685677-f121-4050-b030-28c2f7047497 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.923 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.924 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.925 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.925 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.925 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.927 2 INFO nova.compute.manager [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Terminating instance#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.929 2 DEBUG nova.compute.manager [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:48:56 np0005470441 kernel: tapd6685677-f1 (unregistering): left promiscuous mode
Oct  4 01:48:56 np0005470441 NetworkManager[51690]: <info>  [1759556936.9512] device (tapd6685677-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:48:56 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00326|binding|INFO|Releasing lport d6685677-f121-4050-b030-28c2f7047497 from this chassis (sb_readonly=0)
Oct  4 01:48:56 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00327|binding|INFO|Setting lport d6685677-f121-4050-b030-28c2f7047497 down in Southbound
Oct  4 01:48:56 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00328|binding|INFO|Removing iface tapd6685677-f1 ovn-installed in OVS
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:56.973 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:c1:74 10.100.0.7'], port_security=['fa:16:3e:a7:c1:74 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '029e6753-0289-499a-82e9-f687cb1c3adc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6815d8ee-74cf-4ed7-8e8b-b2daf532177a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f35b87d2-355f-4fd4-aa70-51ec7a8b0e9a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=d6685677-f121-4050-b030-28c2f7047497) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:48:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:56.977 103689 INFO neutron.agent.ovn.metadata.agent [-] Port d6685677-f121-4050-b030-28c2f7047497 in datapath 1572d154-338b-48bc-bb2d-34c8fe54b7b5 unbound from our chassis#033[00m
Oct  4 01:48:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:56.980 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1572d154-338b-48bc-bb2d-34c8fe54b7b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:48:56 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:56.982 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5ae8aaf9-b40f-4f72-a719-03d7e3e66a3a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:56 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:56.983 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5 namespace which is not needed anymore#033[00m
Oct  4 01:48:56 np0005470441 kernel: tapc6c4bf42-05 (unregistering): left promiscuous mode
Oct  4 01:48:56 np0005470441 NetworkManager[51690]: <info>  [1759556936.9886] device (tapc6c4bf42-05): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:56.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00329|binding|INFO|Releasing lport c6c4bf42-05a9-44dd-bcf7-59b8014196a8 from this chassis (sb_readonly=0)
Oct  4 01:48:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00330|binding|INFO|Setting lport c6c4bf42-05a9-44dd-bcf7-59b8014196a8 down in Southbound
Oct  4 01:48:57 np0005470441 ovn_controller[94840]: 2025-10-04T05:48:56Z|00331|binding|INFO|Removing iface tapc6c4bf42-05 ovn-installed in OVS
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.008 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:92:5e 2001:db8::f816:3eff:febd:925e'], port_security=['fa:16:3e:bd:92:5e 2001:db8::f816:3eff:febd:925e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:febd:925e/64', 'neutron:device_id': '029e6753-0289-499a-82e9-f687cb1c3adc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a1d8eda1-499a-4d9d-af36-998257565133', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6815d8ee-74cf-4ed7-8e8b-b2daf532177a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=90d20d70-71ab-44e8-8c3c-f17c571c614c, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=c6c4bf42-05a9-44dd-bcf7-59b8014196a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Oct  4 01:48:57 np0005470441 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d0000002d.scope: Consumed 14.354s CPU time.
Oct  4 01:48:57 np0005470441 systemd-machined[152624]: Machine qemu-25-instance-0000002d terminated.
Oct  4 01:48:57 np0005470441 podman[231971]: 2025-10-04 05:48:57.111408592 +0000 UTC m=+0.122049752 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [NOTICE]   (231728) : haproxy version is 2.8.14-c23fe91
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [NOTICE]   (231728) : path to executable is /usr/sbin/haproxy
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [WARNING]  (231728) : Exiting Master process...
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [ALERT]    (231728) : Current worker (231730) exited with code 143 (Terminated)
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5[231724]: [WARNING]  (231728) : All workers exited. Exiting... (0)
Oct  4 01:48:57 np0005470441 systemd[1]: libpod-72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf.scope: Deactivated successfully.
Oct  4 01:48:57 np0005470441 podman[232026]: 2025-10-04 05:48:57.146172312 +0000 UTC m=+0.050124913 container died 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:48:57 np0005470441 NetworkManager[51690]: <info>  [1759556937.1639] manager: (tapc6c4bf42-05): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Oct  4 01:48:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf-userdata-shm.mount: Deactivated successfully.
Oct  4 01:48:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay-5165ceea94b10106092d6c906f2c0d5126d0771e7cf07f26b1e2370bb0a086e9-merged.mount: Deactivated successfully.
Oct  4 01:48:57 np0005470441 podman[232026]: 2025-10-04 05:48:57.193961327 +0000 UTC m=+0.097913938 container cleanup 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.202 2 INFO nova.virt.libvirt.driver [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Instance destroyed successfully.#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.203 2 DEBUG nova.objects.instance [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 029e6753-0289-499a-82e9-f687cb1c3adc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:48:57 np0005470441 systemd[1]: libpod-conmon-72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf.scope: Deactivated successfully.
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.224 2 DEBUG nova.virt.libvirt.vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:48:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:48:31Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.224 2 DEBUG nova.network.os_vif_util [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.225 2 DEBUG nova.network.os_vif_util [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.226 2 DEBUG os_vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6685677-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.238 2 INFO os_vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a7:c1:74,bridge_name='br-int',has_traffic_filtering=True,id=d6685677-f121-4050-b030-28c2f7047497,network=Network(1572d154-338b-48bc-bb2d-34c8fe54b7b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6685677-f1')#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.239 2 DEBUG nova.virt.libvirt.vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:48:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1505697440',display_name='tempest-TestGettingAddress-server-1505697440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1505697440',id=45,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBERgFZE1MaldIQq4gPcCQRw10bMcTHB63dPGKaLZZQjXzce2Mke7bN+c6lYQJxaL7JYgh8mZtoohT1+uZOpzztXA98MAr3JpSyg17ng8Y54oEnMQFhZ7mortR0tB12Q7+w==',key_name='tempest-TestGettingAddress-1907188162',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:48:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0ky1ur1l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:48:31Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=029e6753-0289-499a-82e9-f687cb1c3adc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.240 2 DEBUG nova.network.os_vif_util [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.240 2 DEBUG nova.network.os_vif_util [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.241 2 DEBUG os_vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.242 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc6c4bf42-05, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.246 2 INFO os_vif [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:92:5e,bridge_name='br-int',has_traffic_filtering=True,id=c6c4bf42-05a9-44dd-bcf7-59b8014196a8,network=Network(a1d8eda1-499a-4d9d-af36-998257565133),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc6c4bf42-05')#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.247 2 INFO nova.virt.libvirt.driver [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Deleting instance files /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc_del#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.247 2 INFO nova.virt.libvirt.driver [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Deletion of /var/lib/nova/instances/029e6753-0289-499a-82e9-f687cb1c3adc_del complete#033[00m
Oct  4 01:48:57 np0005470441 podman[232081]: 2025-10-04 05:48:57.255095585 +0000 UTC m=+0.040151516 container remove 72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.259 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1645e77a-a812-4e12-bfcd-3591ad82291f]: (4, ('Sat Oct  4 05:48:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5 (72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf)\n72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf\nSat Oct  4 05:48:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5 (72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf)\n72d7eb50c4efe16933b834947552f4e98856aad275c5b1704fad339168b97ebf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.261 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1f29b221-505a-4a51-94ad-975303d75e1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.262 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1572d154-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:57 np0005470441 kernel: tap1572d154-30: left promiscuous mode
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.277 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f9eb233e-6e1a-4c9b-bb60-0cc119accdd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.298 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb6957b-2f33-4af9-ac9c-bda0a281ecbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.300 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[43a89689-2e73-4f13-a2c9-84fdd0f47f9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.317 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[87e1a278-35c1-4dc5-a1c8-702dd24dd294]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480809, 'reachable_time': 25246, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232098, 'error': None, 'target': 'ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.319 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1572d154-338b-48bc-bb2d-34c8fe54b7b5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.319 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[4d726ce9-93b0-4194-8d5f-8e8fef1c6a1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.320 103689 INFO neutron.agent.ovn.metadata.agent [-] Port c6c4bf42-05a9-44dd-bcf7-59b8014196a8 in datapath a1d8eda1-499a-4d9d-af36-998257565133 unbound from our chassis#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.321 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a1d8eda1-499a-4d9d-af36-998257565133, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:48:57 np0005470441 systemd[1]: run-netns-ovnmeta\x2d1572d154\x2d338b\x2d48bc\x2dbb2d\x2d34c8fe54b7b5.mount: Deactivated successfully.
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.321 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[144870f5-840e-4307-ba01-eb5dbec6dc1d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.321 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133 namespace which is not needed anymore#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.351 2 INFO nova.compute.manager [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.352 2 DEBUG oslo.service.loopingcall [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.352 2 DEBUG nova.compute.manager [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.352 2 DEBUG nova.network.neutron [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [NOTICE]   (231800) : haproxy version is 2.8.14-c23fe91
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [NOTICE]   (231800) : path to executable is /usr/sbin/haproxy
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [WARNING]  (231800) : Exiting Master process...
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [WARNING]  (231800) : Exiting Master process...
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [ALERT]    (231800) : Current worker (231802) exited with code 143 (Terminated)
Oct  4 01:48:57 np0005470441 neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133[231796]: [WARNING]  (231800) : All workers exited. Exiting... (0)
Oct  4 01:48:57 np0005470441 systemd[1]: libpod-1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e.scope: Deactivated successfully.
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.471 2 DEBUG nova.compute.manager [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-unplugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.472 2 DEBUG oslo_concurrency.lockutils [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:57 np0005470441 podman[232118]: 2025-10-04 05:48:57.472424505 +0000 UTC m=+0.052994185 container died 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.472 2 DEBUG oslo_concurrency.lockutils [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.473 2 DEBUG oslo_concurrency.lockutils [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.473 2 DEBUG nova.compute.manager [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-unplugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.474 2 DEBUG nova.compute.manager [req-7a465c0e-708e-47db-9417-50e5c2cc7f53 req-e78a45c9-46e2-4c5a-8598-3f9c6451bb8a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-unplugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:48:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e-userdata-shm.mount: Deactivated successfully.
Oct  4 01:48:57 np0005470441 systemd[1]: var-lib-containers-storage-overlay-ba1f5750e51ed5d6e07cab52f63d056460f043b2912ae93cbce6ec26a2797f72-merged.mount: Deactivated successfully.
Oct  4 01:48:57 np0005470441 podman[232118]: 2025-10-04 05:48:57.51289845 +0000 UTC m=+0.093468140 container cleanup 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:48:57 np0005470441 systemd[1]: libpod-conmon-1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e.scope: Deactivated successfully.
Oct  4 01:48:57 np0005470441 podman[232145]: 2025-10-04 05:48:57.587116324 +0000 UTC m=+0.050161094 container remove 1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.593 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[67e9bd3d-74b2-4ab2-82b9-632a6127ad90]: (4, ('Sat Oct  4 05:48:57 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133 (1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e)\n1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e\nSat Oct  4 05:48:57 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133 (1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e)\n1f41a790adfc1779c6227a3e8815c26a2950be0d25f9575bf7d65a6e1874f62e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.595 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[149e3801-91ce-4148-b0fa-62765351f40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.596 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1d8eda1-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 kernel: tapa1d8eda1-40: left promiscuous mode
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.604 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9f6c93-b268-4db9-b03f-bd415d2223a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 nova_compute[192626]: 2025-10-04 05:48:57.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.636 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3ad7d4-49e6-429d-afe9-3aa49d47f43d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.638 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c0889b03-8901-427c-9893-aa7f8b5b9b79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.665 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2ecdadf6-c483-4d04-9a92-20048eff795d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 480907, 'reachable_time': 34829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232164, 'error': None, 'target': 'ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.667 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a1d8eda1-499a-4d9d-af36-998257565133 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:48:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:48:57.668 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[781bcddd-b32a-433d-9a53-0e5383526e3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:48:58 np0005470441 systemd[1]: run-netns-ovnmeta\x2da1d8eda1\x2d499a\x2d4d9d\x2daf36\x2d998257565133.mount: Deactivated successfully.
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.340 2 DEBUG nova.network.neutron [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updated VIF entry in instance network info cache for port d6685677-f121-4050-b030-28c2f7047497. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.341 2 DEBUG nova.network.neutron [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [{"id": "d6685677-f121-4050-b030-28c2f7047497", "address": "fa:16:3e:a7:c1:74", "network": {"id": "1572d154-338b-48bc-bb2d-34c8fe54b7b5", "bridge": "br-int", "label": "tempest-network-smoke--749348197", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6685677-f1", "ovs_interfaceid": "d6685677-f121-4050-b030-28c2f7047497", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "address": "fa:16:3e:bd:92:5e", "network": {"id": "a1d8eda1-499a-4d9d-af36-998257565133", "bridge": "br-int", "label": "tempest-network-smoke--542624776", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:febd:925e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc6c4bf42-05", "ovs_interfaceid": "c6c4bf42-05a9-44dd-bcf7-59b8014196a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.359 2 DEBUG oslo_concurrency.lockutils [req-be4bc8c5-00c3-494f-98bf-f1d59061a8d2 req-f875d82a-c25b-48c1-bb27-78df0d240fdc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-029e6753-0289-499a-82e9-f687cb1c3adc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.575 2 DEBUG nova.network.neutron [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.593 2 INFO nova.compute.manager [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Took 1.24 seconds to deallocate network for instance.#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.634 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.635 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.697 2 DEBUG nova.compute.provider_tree [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.717 2 DEBUG nova.scheduler.client.report [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.746 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.777 2 INFO nova.scheduler.client.report [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 029e6753-0289-499a-82e9-f687cb1c3adc#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.871 2 DEBUG oslo_concurrency.lockutils [None req-cca46ce9-91cc-4163-b4f6-b4bc7669b33b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.947s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.959 2 DEBUG nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-unplugged-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.960 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.960 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.960 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.960 2 DEBUG nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-unplugged-d6685677-f121-4050-b030-28c2f7047497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.960 2 WARNING nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received unexpected event network-vif-unplugged-d6685677-f121-4050-b030-28c2f7047497 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.961 2 DEBUG nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.961 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.961 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.961 2 DEBUG oslo_concurrency.lockutils [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.961 2 DEBUG nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:58 np0005470441 nova_compute[192626]: 2025-10-04 05:48:58.962 2 WARNING nova.compute.manager [req-825e5d37-633d-488f-b879-f7cc79607376 req-1915ca7e-93c1-4a3a-8c09-9e02ac01fa78 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received unexpected event network-vif-plugged-d6685677-f121-4050-b030-28c2f7047497 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.594 2 DEBUG nova.compute.manager [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.595 2 DEBUG oslo_concurrency.lockutils [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.596 2 DEBUG oslo_concurrency.lockutils [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.596 2 DEBUG oslo_concurrency.lockutils [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "029e6753-0289-499a-82e9-f687cb1c3adc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.597 2 DEBUG nova.compute.manager [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] No waiting events found dispatching network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.597 2 WARNING nova.compute.manager [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received unexpected event network-vif-plugged-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.598 2 DEBUG nova.compute.manager [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-deleted-d6685677-f121-4050-b030-28c2f7047497 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:48:59 np0005470441 nova_compute[192626]: 2025-10-04 05:48:59.598 2 DEBUG nova.compute.manager [req-ae32bc32-d643-45db-b885-95626de61fc7 req-c8d9ab7a-8a3c-40ba-94a0-183e11b161ae 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Received event network-vif-deleted-c6c4bf42-05a9-44dd-bcf7-59b8014196a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:49:02 np0005470441 nova_compute[192626]: 2025-10-04 05:49:02.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:03 np0005470441 nova_compute[192626]: 2025-10-04 05:49:03.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:05 np0005470441 podman[232165]: 2025-10-04 05:49:05.308350637 +0000 UTC m=+0.059059260 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:49:05 np0005470441 podman[232166]: 2025-10-04 05:49:05.328358863 +0000 UTC m=+0.076901063 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:49:06 np0005470441 nova_compute[192626]: 2025-10-04 05:49:06.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:06.757 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:49:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:06.757 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:49:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:06.758 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:49:07 np0005470441 nova_compute[192626]: 2025-10-04 05:49:07.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:07 np0005470441 nova_compute[192626]: 2025-10-04 05:49:07.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:08 np0005470441 nova_compute[192626]: 2025-10-04 05:49:08.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:10 np0005470441 podman[232203]: 2025-10-04 05:49:10.317648324 +0000 UTC m=+0.067521203 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  4 01:49:10 np0005470441 podman[232204]: 2025-10-04 05:49:10.361748942 +0000 UTC m=+0.111864418 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:49:10 np0005470441 nova_compute[192626]: 2025-10-04 05:49:10.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:10 np0005470441 nova_compute[192626]: 2025-10-04 05:49:10.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.203 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759556937.20043, 029e6753-0289-499a-82e9-f687cb1c3adc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.204 2 INFO nova.compute.manager [-] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.234 2 DEBUG nova.compute.manager [None req-5a89fbbe-bc64-4702-abb6-7bfc674d47f0 - - - - - -] [instance: 029e6753-0289-499a-82e9-f687cb1c3adc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:49:12 np0005470441 nova_compute[192626]: 2025-10-04 05:49:12.733 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:49:13 np0005470441 nova_compute[192626]: 2025-10-04 05:49:13.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:13 np0005470441 nova_compute[192626]: 2025-10-04 05:49:13.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:15 np0005470441 nova_compute[192626]: 2025-10-04 05:49:15.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:16 np0005470441 podman[232241]: 2025-10-04 05:49:16.329096794 +0000 UTC m=+0.078728875 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.746 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.746 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.972 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.974 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5725MB free_disk=73.42045211791992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.974 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:49:16 np0005470441 nova_compute[192626]: 2025-10-04 05:49:16.974 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.060 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.061 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.091 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.121 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.158 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.159 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:49:17 np0005470441 nova_compute[192626]: 2025-10-04 05:49:17.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:18 np0005470441 nova_compute[192626]: 2025-10-04 05:49:18.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:19 np0005470441 nova_compute[192626]: 2025-10-04 05:49:19.154 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:19 np0005470441 nova_compute[192626]: 2025-10-04 05:49:19.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:20 np0005470441 nova_compute[192626]: 2025-10-04 05:49:20.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:49:22 np0005470441 nova_compute[192626]: 2025-10-04 05:49:22.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:23 np0005470441 podman[232263]: 2025-10-04 05:49:23.297713688 +0000 UTC m=+0.053966514 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:49:23 np0005470441 nova_compute[192626]: 2025-10-04 05:49:23.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:25 np0005470441 podman[232287]: 2025-10-04 05:49:25.335592025 +0000 UTC m=+0.075406089 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:49:27 np0005470441 nova_compute[192626]: 2025-10-04 05:49:27.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:27 np0005470441 podman[232305]: 2025-10-04 05:49:27.365243815 +0000 UTC m=+0.102228461 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 01:49:28 np0005470441 nova_compute[192626]: 2025-10-04 05:49:28.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:32 np0005470441 nova_compute[192626]: 2025-10-04 05:49:32.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:33 np0005470441 nova_compute[192626]: 2025-10-04 05:49:33.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.487 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:49:35 np0005470441 nova_compute[192626]: 2025-10-04 05:49:35.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.488 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.490 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.571 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:4c:5a 2001:db8:0:1:f816:3eff:feaa:4c5a 2001:db8::f816:3eff:feaa:4c5a'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feaa:4c5a/64 2001:db8::f816:3eff:feaa:4c5a/64', 'neutron:device_id': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b2b68f0-e753-4450-950e-3f0a5cb64650, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2f1f60fd-9e9f-4c74-b21c-feb88a89e536) old=Port_Binding(mac=['fa:16:3e:aa:4c:5a 2001:db8::f816:3eff:feaa:4c5a'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feaa:4c5a/64', 'neutron:device_id': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.573 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2f1f60fd-9e9f-4c74-b21c-feb88a89e536 in datapath 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc updated#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.574 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:49:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:49:35.575 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a82a4b67-80fc-46e2-9957-f0ed37a1e747]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:49:36 np0005470441 podman[232331]: 2025-10-04 05:49:36.309100116 +0000 UTC m=+0.062311794 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  4 01:49:36 np0005470441 podman[232332]: 2025-10-04 05:49:36.319928417 +0000 UTC m=+0.066489183 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:49:37 np0005470441 nova_compute[192626]: 2025-10-04 05:49:37.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:38 np0005470441 nova_compute[192626]: 2025-10-04 05:49:38.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:41 np0005470441 podman[232376]: 2025-10-04 05:49:41.345723389 +0000 UTC m=+0.085227233 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:49:41 np0005470441 podman[232375]: 2025-10-04 05:49:41.355530391 +0000 UTC m=+0.101811900 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:49:42 np0005470441 nova_compute[192626]: 2025-10-04 05:49:42.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:43 np0005470441 nova_compute[192626]: 2025-10-04 05:49:43.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:47 np0005470441 nova_compute[192626]: 2025-10-04 05:49:47.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:47 np0005470441 podman[232414]: 2025-10-04 05:49:47.328910799 +0000 UTC m=+0.073945498 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:49:48 np0005470441 nova_compute[192626]: 2025-10-04 05:49:48.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:52 np0005470441 nova_compute[192626]: 2025-10-04 05:49:52.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:53 np0005470441 nova_compute[192626]: 2025-10-04 05:49:53.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:54 np0005470441 podman[232437]: 2025-10-04 05:49:54.315418117 +0000 UTC m=+0.066365479 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:49:56 np0005470441 podman[232461]: 2025-10-04 05:49:56.303965635 +0000 UTC m=+0.052971244 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:49:56 np0005470441 ovn_controller[94840]: 2025-10-04T05:49:56Z|00332|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  4 01:49:57 np0005470441 nova_compute[192626]: 2025-10-04 05:49:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:58 np0005470441 nova_compute[192626]: 2025-10-04 05:49:58.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:49:58 np0005470441 podman[232480]: 2025-10-04 05:49:58.338156465 +0000 UTC m=+0.088529236 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:50:02 np0005470441 nova_compute[192626]: 2025-10-04 05:50:02.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.711 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:50:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:50:03 np0005470441 nova_compute[192626]: 2025-10-04 05:50:03.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:06.758 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:06.759 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:06.759 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:07 np0005470441 nova_compute[192626]: 2025-10-04 05:50:07.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:07 np0005470441 podman[232507]: 2025-10-04 05:50:07.315201341 +0000 UTC m=+0.059057300 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:50:07 np0005470441 podman[232506]: 2025-10-04 05:50:07.315465118 +0000 UTC m=+0.061029196 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:50:07 np0005470441 nova_compute[192626]: 2025-10-04 05:50:07.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:07 np0005470441 nova_compute[192626]: 2025-10-04 05:50:07.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:08 np0005470441 nova_compute[192626]: 2025-10-04 05:50:08.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:12 np0005470441 nova_compute[192626]: 2025-10-04 05:50:12.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:12 np0005470441 podman[232549]: 2025-10-04 05:50:12.296980846 +0000 UTC m=+0.048749304 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:50:12 np0005470441 podman[232548]: 2025-10-04 05:50:12.329314956 +0000 UTC m=+0.081891567 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  4 01:50:12 np0005470441 nova_compute[192626]: 2025-10-04 05:50:12.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:12 np0005470441 nova_compute[192626]: 2025-10-04 05:50:12.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:50:12 np0005470441 nova_compute[192626]: 2025-10-04 05:50:12.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:12 np0005470441 nova_compute[192626]: 2025-10-04 05:50:12.719 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 01:50:13 np0005470441 nova_compute[192626]: 2025-10-04 05:50:13.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:13 np0005470441 nova_compute[192626]: 2025-10-04 05:50:13.776 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:13 np0005470441 nova_compute[192626]: 2025-10-04 05:50:13.776 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:50:13 np0005470441 nova_compute[192626]: 2025-10-04 05:50:13.776 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:50:13 np0005470441 nova_compute[192626]: 2025-10-04 05:50:13.793 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:50:14 np0005470441 nova_compute[192626]: 2025-10-04 05:50:14.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:14 np0005470441 nova_compute[192626]: 2025-10-04 05:50:14.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:14 np0005470441 nova_compute[192626]: 2025-10-04 05:50:14.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 01:50:14 np0005470441 nova_compute[192626]: 2025-10-04 05:50:14.771 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 01:50:17 np0005470441 nova_compute[192626]: 2025-10-04 05:50:17.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:17 np0005470441 nova_compute[192626]: 2025-10-04 05:50:17.772 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:18 np0005470441 podman[232584]: 2025-10-04 05:50:18.303420564 +0000 UTC m=+0.053529701 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.754 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.754 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.755 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.755 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.970 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.971 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5762MB free_disk=73.42045211791992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.971 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:18 np0005470441 nova_compute[192626]: 2025-10-04 05:50:18.972 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.089 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.090 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.158 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.187 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.188 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:50:19 np0005470441 nova_compute[192626]: 2025-10-04 05:50:19.188 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:21 np0005470441 nova_compute[192626]: 2025-10-04 05:50:21.189 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:21 np0005470441 nova_compute[192626]: 2025-10-04 05:50:21.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.649 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.649 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.666 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.773 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.774 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.782 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.783 2 INFO nova.compute.claims [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.893 2 DEBUG nova.compute.provider_tree [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.909 2 DEBUG nova.scheduler.client.report [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.936 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.937 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.985 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:50:22 np0005470441 nova_compute[192626]: 2025-10-04 05:50:22.986 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.006 2 INFO nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.024 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.105 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.107 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.108 2 INFO nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Creating image(s)#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.109 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.110 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.111 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.140 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.219 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.221 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.222 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.253 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.279 2 DEBUG nova.policy [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.320 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.321 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.379 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.381 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.382 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.448 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.449 2 DEBUG nova.virt.disk.api [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.450 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.526 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.527 2 DEBUG nova.virt.disk.api [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.528 2 DEBUG nova.objects.instance [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 89f82b75-7054-4ae1-8b6f-7e13cd7789cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.546 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.547 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Ensure instance console log exists: /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.547 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.547 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:23 np0005470441 nova_compute[192626]: 2025-10-04 05:50:23.548 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:24 np0005470441 nova_compute[192626]: 2025-10-04 05:50:24.042 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Successfully created port: 04bbe94a-5455-4c06-ba6c-4a5efffa9049 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:50:24 np0005470441 nova_compute[192626]: 2025-10-04 05:50:24.709 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Successfully created port: 75cd5dc5-982f-4aa2-9be0-bf6346669b8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:50:25 np0005470441 podman[232620]: 2025-10-04 05:50:25.311748641 +0000 UTC m=+0.056358942 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.754 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Successfully updated port: 04bbe94a-5455-4c06-ba6c-4a5efffa9049 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.857 2 DEBUG nova.compute.manager [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.858 2 DEBUG nova.compute.manager [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing instance network info cache due to event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.858 2 DEBUG oslo_concurrency.lockutils [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.858 2 DEBUG oslo_concurrency.lockutils [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:50:25 np0005470441 nova_compute[192626]: 2025-10-04 05:50:25.859 2 DEBUG nova.network.neutron [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing network info cache for port 04bbe94a-5455-4c06-ba6c-4a5efffa9049 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:50:26 np0005470441 nova_compute[192626]: 2025-10-04 05:50:26.091 2 DEBUG nova.network.neutron [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:50:26 np0005470441 nova_compute[192626]: 2025-10-04 05:50:26.785 2 DEBUG nova.network.neutron [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:50:26 np0005470441 nova_compute[192626]: 2025-10-04 05:50:26.808 2 DEBUG oslo_concurrency.lockutils [req-c676cb46-a48b-4c84-8f0c-2001bd95d366 req-1931d898-040a-4c96-937b-e7894b81babc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:50:26 np0005470441 nova_compute[192626]: 2025-10-04 05:50:26.986 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Successfully updated port: 75cd5dc5-982f-4aa2-9be0-bf6346669b8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.007 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.007 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.008 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.166 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:27 np0005470441 podman[232644]: 2025-10-04 05:50:27.309475553 +0000 UTC m=+0.061466599 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.935 2 DEBUG nova.compute.manager [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-changed-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.936 2 DEBUG nova.compute.manager [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing instance network info cache due to event network-changed-75cd5dc5-982f-4aa2-9be0-bf6346669b8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:50:27 np0005470441 nova_compute[192626]: 2025-10-04 05:50:27.936 2 DEBUG oslo_concurrency.lockutils [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:50:28 np0005470441 nova_compute[192626]: 2025-10-04 05:50:28.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:29 np0005470441 podman[232663]: 2025-10-04 05:50:29.345825957 +0000 UTC m=+0.094787547 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  4 01:50:30 np0005470441 nova_compute[192626]: 2025-10-04 05:50:30.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.365 2 DEBUG nova.network.neutron [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.386 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.387 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance network_info: |[{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.388 2 DEBUG oslo_concurrency.lockutils [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.388 2 DEBUG nova.network.neutron [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing network info cache for port 75cd5dc5-982f-4aa2-9be0-bf6346669b8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.392 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Start _get_guest_xml network_info=[{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.397 2 WARNING nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.407 2 DEBUG nova.virt.libvirt.host [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.408 2 DEBUG nova.virt.libvirt.host [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.412 2 DEBUG nova.virt.libvirt.host [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.413 2 DEBUG nova.virt.libvirt.host [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
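The four lines above show `nova.virt.libvirt.host` probing the host for a CPU controller: the cgroups v1 check comes up empty, then the cgroups v2 check succeeds. On a v2 (unified) hierarchy the available controllers are listed in `/sys/fs/cgroup/cgroup.controllers`. A minimal sketch of that v2 check follows; it is an illustration of the mechanism, not Nova's actual `_has_cgroupsv2_cpu_controller` implementation, and the function name is hypothetical.

```python
from pathlib import Path

# On a cgroups v2 host, this file lists the controllers the root cgroup exposes,
# e.g. "cpuset cpu io memory hugetlb pids rdma misc".
CGROUP_V2_CONTROLLERS = Path("/sys/fs/cgroup/cgroup.controllers")

def has_cgroupsv2_cpu_controller(controllers_file: Path = CGROUP_V2_CONTROLLERS) -> bool:
    """Return True if the unified (v2) cgroup hierarchy exposes the 'cpu' controller."""
    try:
        controllers = controllers_file.read_text().split()
    except OSError:
        # File absent or unreadable: the host is not on the unified hierarchy.
        return False
    return "cpu" in controllers
```

Passing the path as a parameter keeps the check testable without touching the real `/sys/fs/cgroup`.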
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.414 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.414 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.415 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.415 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.415 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.416 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.416 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.416 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.416 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.417 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.417 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.417 2 DEBUG nova.virt.hardware [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
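In the topology lines above, `nova.virt.hardware` takes the flavor's 1 vCPU, applies the (effectively unlimited) 65536/65536/65536 limits, enumerates every sockets:cores:threads split whose product equals the vCPU count, and ends up with the single candidate `1:1:1`. A small sketch of that enumeration, assuming brute-force search (Nova's real `_get_possible_cpu_topologies` differs in detail):

```python
import itertools
from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Enumerate every (sockets, cores, threads) split whose product equals vcpus,
    subject to the per-dimension maxima seen in the log."""
    found = []
    for s, c, t in itertools.product(
        range(1, min(vcpus, max_sockets) + 1),
        range(1, min(vcpus, max_cores) + 1),
        range(1, min(vcpus, max_threads) + 1),
    ):
        if s * c * t == vcpus:
            found.append(VirtCPUTopology(s, c, t))
    return found
```

For 1 vCPU only `VirtCPUTopology(sockets=1, cores=1, threads=1)` qualifies, matching the "Got 1 possible topologies" line; a 4-vCPU flavor would yield six ordered splits.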
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.421 2 DEBUG nova.virt.libvirt.vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:50:23Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.422 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.422 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.423 2 DEBUG nova.virt.libvirt.vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:50:23Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.423 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.424 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
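The "Converting VIF" / "Converted object" pairs above show `nova.network.os_vif_util` turning each Nova `network_info` dict into a `VIFOpenVSwitch` object for os-vif. A rough sketch of that field mapping, using a plain dataclass rather than the real os-vif object model (the class and function names here are illustrative, not Nova's):

```python
from dataclasses import dataclass

@dataclass
class OpenVSwitchVIF:
    """Minimal stand-in for os_vif's VIFOpenVSwitch."""
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool
    active: bool

def nova_vif_to_ovs(vif: dict) -> OpenVSwitchVIF:
    """Map the fields of a Nova 'ovs'-type network_info entry onto an
    os-vif-like object, as the log's nova_to_osvif_vif conversion does."""
    if vif["type"] != "ovs":
        raise ValueError("not an OVS VIF: %s" % vif["type"])
    details = vif.get("details", {})
    return OpenVSwitchVIF(
        id=vif["id"],
        address=vif["address"],
        # Prefer the bound bridge from port binding details, fall back to the network.
        bridge_name=details.get("bridge_name", vif["network"]["bridge"]),
        vif_name=vif["devname"],
        has_traffic_filtering=details.get("port_filter", False),
        active=vif["active"],
    )
```

Note how `port_filter: true` in the binding details becomes `has_traffic_filtering=True` on the converted object, matching the log output.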
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.424 2 DEBUG nova.objects.instance [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 89f82b75-7054-4ae1-8b6f-7e13cd7789cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.439 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <uuid>89f82b75-7054-4ae1-8b6f-7e13cd7789cc</uuid>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <name>instance-0000002f</name>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-1686939085</nova:name>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:50:32</nova:creationTime>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:port uuid="04bbe94a-5455-4c06-ba6c-4a5efffa9049">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        <nova:port uuid="75cd5dc5-982f-4aa2-9be0-bf6346669b8b">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe44:9c4c" ipVersion="6"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe44:9c4c" ipVersion="6"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="serial">89f82b75-7054-4ae1-8b6f-7e13cd7789cc</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="uuid">89f82b75-7054-4ae1-8b6f-7e13cd7789cc</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.config"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:22:a8:28"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <target dev="tap04bbe94a-54"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:44:9c:4c"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <target dev="tap75cd5dc5-98"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/console.log" append="off"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:50:32 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:50:32 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:50:32 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:50:32 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
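The domain XML that `_get_guest_xml` emits above embeds Nova-specific metadata under the `http://openstack.org/xmlns/libvirt/nova/1.1` namespace, including the instance's ports and their fixed IPs. That block is machine-readable; a short sketch of extracting the port-to-IP mapping with the standard library (the helper name is ours, not Nova's):

```python
import xml.etree.ElementTree as ET

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

def instance_port_ips(domain_xml: str) -> dict:
    """Return {port_uuid: [ip_address, ...]} from the <nova:ports> metadata
    block of a libvirt domain XML produced by Nova."""
    root = ET.fromstring(domain_xml)
    ports = {}
    for port in root.findall(".//nova:ports/nova:port", NOVA_NS):
        ips = [ip.get("address") for ip in port.findall("nova:ip", NOVA_NS)]
        ports[port.get("uuid")] = ips
    return ports
```

Run against the XML above, this would map port `04bbe94a-…` to its v4 address and port `75cd5dc5-…` to its two SLAAC v6 addresses.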
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.440 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Preparing to wait for external event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.441 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.441 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.441 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.441 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Preparing to wait for external event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.441 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.442 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.442 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.443 2 DEBUG nova.virt.libvirt.vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:50:23Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.443 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.444 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.444 2 DEBUG os_vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.445 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.448 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap04bbe94a-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.449 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap04bbe94a-54, col_values=(('external_ids', {'iface-id': '04bbe94a-5455-4c06-ba6c-4a5efffa9049', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:a8:28', 'vm-uuid': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 NetworkManager[51690]: <info>  [1759557032.4510] manager: (tap04bbe94a-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.457 2 INFO os_vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54')#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.458 2 DEBUG nova.virt.libvirt.vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:50:23Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.459 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.459 2 DEBUG nova.network.os_vif_util [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.460 2 DEBUG os_vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.463 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap75cd5dc5-98, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.464 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap75cd5dc5-98, col_values=(('external_ids', {'iface-id': '75cd5dc5-982f-4aa2-9be0-bf6346669b8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:44:9c:4c', 'vm-uuid': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:32 np0005470441 NetworkManager[51690]: <info>  [1759557032.4665] manager: (tap75cd5dc5-98): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/158)
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.474 2 INFO os_vif [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98')#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.527 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.528 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.528 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:22:a8:28, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.528 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:44:9c:4c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:50:32 np0005470441 nova_compute[192626]: 2025-10-04 05:50:32.529 2 INFO nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Using config drive#033[00m
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.468 2 INFO nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Creating config drive at /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.config#033[00m
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.473 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwq4l73f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.601 2 DEBUG oslo_concurrency.processutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjwq4l73f" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.6756] manager: (tap04bbe94a-54): new Tun device (/org/freedesktop/NetworkManager/Devices/159)
Oct  4 01:50:33 np0005470441 kernel: tap04bbe94a-54: entered promiscuous mode
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00333|binding|INFO|Claiming lport 04bbe94a-5455-4c06-ba6c-4a5efffa9049 for this chassis.
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00334|binding|INFO|04bbe94a-5455-4c06-ba6c-4a5efffa9049: Claiming fa:16:3e:22:a8:28 10.100.0.5
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.6917] manager: (tap75cd5dc5-98): new Tun device (/org/freedesktop/NetworkManager/Devices/160)
Oct  4 01:50:33 np0005470441 kernel: tap75cd5dc5-98: entered promiscuous mode
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00335|if_status|INFO|Not updating pb chassis for 75cd5dc5-982f-4aa2-9be0-bf6346669b8b now as sb is readonly
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.6985] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.6991] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 systemd-udevd[232713]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:50:33 np0005470441 systemd-udevd[232714]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.713 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:a8:28 10.100.0.5'], port_security=['fa:16:3e:22:a8:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c845a604-670f-43c9-8bbb-844bb6410caa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10d4b4c0-3a9c-417c-9b51-c2023e31f023, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=04bbe94a-5455-4c06-ba6c-4a5efffa9049) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.714 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 04bbe94a-5455-4c06-ba6c-4a5efffa9049 in datapath 114f6534-09ca-4cdb-b1a7-f666738e5e43 bound to our chassis#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.715 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 114f6534-09ca-4cdb-b1a7-f666738e5e43#033[00m
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.7267] device (tap75cd5dc5-98): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.7275] device (tap75cd5dc5-98): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.727 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e087512c-c6a7-4ee7-9a02-45b493a096d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.728 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap114f6534-01 in ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.7302] device (tap04bbe94a-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.7307] device (tap04bbe94a-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.731 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap114f6534-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.731 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b8cfa25f-55a5-453f-a07e-87494cd2a72b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.732 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e3991acf-64fe-4f38-8648-b411e6d296a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.743 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[a36fc6af-0eb0-408d-9db5-5c5d12d5fd82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 systemd-machined[152624]: New machine qemu-26-instance-0000002f.
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.769 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[88f4b81f-c097-4a8a-94a2-7425b9a1137e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 systemd[1]: Started Virtual Machine qemu-26-instance-0000002f.
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00336|binding|INFO|Claiming lport 75cd5dc5-982f-4aa2-9be0-bf6346669b8b for this chassis.
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00337|binding|INFO|75cd5dc5-982f-4aa2-9be0-bf6346669b8b: Claiming fa:16:3e:44:9c:4c 2001:db8:0:1:f816:3eff:fe44:9c4c 2001:db8::f816:3eff:fe44:9c4c
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.802 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[46139a68-f7a8-47b9-8bce-ef44e0197df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00338|binding|INFO|Setting lport 04bbe94a-5455-4c06-ba6c-4a5efffa9049 ovn-installed in OVS
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00339|binding|INFO|Setting lport 04bbe94a-5455-4c06-ba6c-4a5efffa9049 up in Southbound
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.8192] manager: (tap114f6534-00): new Veth device (/org/freedesktop/NetworkManager/Devices/163)
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.818 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9c:4c 2001:db8:0:1:f816:3eff:fe44:9c4c 2001:db8::f816:3eff:fe44:9c4c'], port_security=['fa:16:3e:44:9c:4c 2001:db8:0:1:f816:3eff:fe44:9c4c 2001:db8::f816:3eff:fe44:9c4c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe44:9c4c/64 2001:db8::f816:3eff:fe44:9c4c/64', 'neutron:device_id': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c845a604-670f-43c9-8bbb-844bb6410caa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b2b68f0-e753-4450-950e-3f0a5cb64650, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=75cd5dc5-982f-4aa2-9be0-bf6346669b8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.817 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ba73e814-57f6-41c9-9c7b-3ab8309ddf5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00340|binding|INFO|Setting lport 75cd5dc5-982f-4aa2-9be0-bf6346669b8b ovn-installed in OVS
Oct  4 01:50:33 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:33Z|00341|binding|INFO|Setting lport 75cd5dc5-982f-4aa2-9be0-bf6346669b8b up in Southbound
Oct  4 01:50:33 np0005470441 nova_compute[192626]: 2025-10-04 05:50:33.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.851 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ed892d3b-1ba1-419e-b90f-35972afd5d15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.854 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[fcc67d49-69ca-4130-b74f-026014d74823]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 NetworkManager[51690]: <info>  [1759557033.8772] device (tap114f6534-00): carrier: link connected
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.882 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[185d2917-f400-4365-b3cd-d236bcf42511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.896 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9637b15f-88d8-4456-86d3-eadea623c7e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap114f6534-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:78:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493153, 'reachable_time': 41952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232750, 'error': None, 'target': 'ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.909 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[895fc381-43d1-4936-afec-fd82371d2c29]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef6:783e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493153, 'tstamp': 493153}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232751, 'error': None, 'target': 'ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.926 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f233c133-3b4d-4f60-8690-f4c6c0d0524c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap114f6534-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f6:78:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493153, 'reachable_time': 41952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232752, 'error': None, 'target': 'ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:33.954 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a103fa-362b-4fa3-9c8f-1d884d36a7bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.011 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[159695fd-4585-4b13-8e31-5d18dc84bfdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.014 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap114f6534-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.015 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.016 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap114f6534-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 NetworkManager[51690]: <info>  [1759557034.0194] manager: (tap114f6534-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/164)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:34 np0005470441 kernel: tap114f6534-00: entered promiscuous mode
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.023 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap114f6534-00, col_values=(('external_ids', {'iface-id': 'f451af46-e3bd-475c-acae-5781881718f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:34Z|00342|binding|INFO|Releasing lport f451af46-e3bd-475c-acae-5781881718f0 from this chassis (sb_readonly=0)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.036 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/114f6534-09ca-4cdb-b1a7-f666738e5e43.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/114f6534-09ca-4cdb-b1a7-f666738e5e43.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.037 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2df90ee6-4187-42ee-b39c-fea8134e2d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.038 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-114f6534-09ca-4cdb-b1a7-f666738e5e43
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/114f6534-09ca-4cdb-b1a7-f666738e5e43.pid.haproxy
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 114f6534-09ca-4cdb-b1a7-f666738e5e43
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.038 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'env', 'PROCESS_TAG=haproxy-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/114f6534-09ca-4cdb-b1a7-f666738e5e43.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:50:34 np0005470441 podman[232792]: 2025-10-04 05:50:34.442917309 +0000 UTC m=+0.072023913 container create 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  4 01:50:34 np0005470441 systemd[1]: Started libpod-conmon-57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0.scope.
Oct  4 01:50:34 np0005470441 podman[232792]: 2025-10-04 05:50:34.402392563 +0000 UTC m=+0.031499217 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.495 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557034.4944944, 89f82b75-7054-4ae1-8b6f-7e13cd7789cc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.496 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] VM Started (Lifecycle Event)#033[00m
Oct  4 01:50:34 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:50:34 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1245d9b975c652c45d0906dc161aa2c07821a3d008fe67b3f76c618c724469a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.517 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:50:34 np0005470441 podman[232792]: 2025-10-04 05:50:34.51944281 +0000 UTC m=+0.148549434 container init 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.521 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557034.4959986, 89f82b75-7054-4ae1-8b6f-7e13cd7789cc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.521 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:50:34 np0005470441 podman[232792]: 2025-10-04 05:50:34.525222416 +0000 UTC m=+0.154329020 container start 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.529 2 DEBUG nova.compute.manager [req-85eff06b-d808-4b44-98e6-1d0c2d02abaf req-a21b6199-2bd3-4b73-a04c-590641103560 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.530 2 DEBUG oslo_concurrency.lockutils [req-85eff06b-d808-4b44-98e6-1d0c2d02abaf req-a21b6199-2bd3-4b73-a04c-590641103560 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.530 2 DEBUG oslo_concurrency.lockutils [req-85eff06b-d808-4b44-98e6-1d0c2d02abaf req-a21b6199-2bd3-4b73-a04c-590641103560 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.531 2 DEBUG oslo_concurrency.lockutils [req-85eff06b-d808-4b44-98e6-1d0c2d02abaf req-a21b6199-2bd3-4b73-a04c-590641103560 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.531 2 DEBUG nova.compute.manager [req-85eff06b-d808-4b44-98e6-1d0c2d02abaf req-a21b6199-2bd3-4b73-a04c-590641103560 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Processing event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.543 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.547 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:50:34 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [NOTICE]   (232811) : New worker (232813) forked
Oct  4 01:50:34 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [NOTICE]   (232811) : Loading success.
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.578 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.616 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 75cd5dc5-982f-4aa2-9be0-bf6346669b8b in datapath 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc unbound from our chassis#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.618 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.635 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8ba519-31d3-4aa0-87d2-38d63a242009]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.636 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap471b274e-01 in ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.638 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap471b274e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.638 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f823015-eb33-40ab-aa91-4e7db775631e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.640 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1d834bb3-3109-4264-910d-674902fe91c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.651 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9544c8-c64b-4bce-9720-08a005f5fe91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.664 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[eb6959b5-217a-4eb0-995c-33f4088f7df5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.700 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[580cc09d-c0a0-43dc-8ca9-011acc1ef30f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 NetworkManager[51690]: <info>  [1759557034.7076] manager: (tap471b274e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/165)
Oct  4 01:50:34 np0005470441 systemd-udevd[232735]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.708 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a92e6b42-25a8-42ab-8ad8-3a6653097f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.745 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a7346b52-d428-472a-9776-d804924d7268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.750 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[d684bf67-3830-4144-9edd-af1f83b7ef6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 NetworkManager[51690]: <info>  [1759557034.7772] device (tap471b274e-00): carrier: link connected
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.782 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c0c88bf8-fa8c-4f99-b0a4-f0ef9e31e792]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.798 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5e18dffa-04bb-4268-9cbf-c3c76f2cf1b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap471b274e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:4c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493243, 'reachable_time': 39554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232832, 'error': None, 'target': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.815 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2649b8b7-db64-4c94-9aab-ac48504e162a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaa:4c5a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493243, 'tstamp': 493243}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232833, 'error': None, 'target': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.833 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[750a7dca-e419-4fbd-afa1-8822d2391556]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap471b274e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:aa:4c:5a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 93], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493243, 'reachable_time': 39554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232834, 'error': None, 'target': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.867 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[62674364-8ffa-4f4e-ae78-ce29079531c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.904 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c88a7b6b-9d3e-460e-a699-08ef11be75b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.905 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap471b274e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.906 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.906 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap471b274e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:34 np0005470441 NetworkManager[51690]: <info>  [1759557034.9097] manager: (tap471b274e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Oct  4 01:50:34 np0005470441 kernel: tap471b274e-00: entered promiscuous mode
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.911 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap471b274e-00, col_values=(('external_ids', {'iface-id': '2f1f60fd-9e9f-4c74-b21c-feb88a89e536'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:50:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:34Z|00343|binding|INFO|Releasing lport 2f1f60fd-9e9f-4c74-b21c-feb88a89e536 from this chassis (sb_readonly=0)
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.914 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/471b274e-0e71-4d3c-ac3e-42e8f82f9dbc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/471b274e-0e71-4d3c-ac3e-42e8f82f9dbc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.915 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7e07c83f-a00f-4ebe-9c19-9bc1f2dd802c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.916 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/471b274e-0e71-4d3c-ac3e-42e8f82f9dbc.pid.haproxy
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:50:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:34.917 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'env', 'PROCESS_TAG=haproxy-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/471b274e-0e71-4d3c-ac3e-42e8f82f9dbc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:50:34 np0005470441 nova_compute[192626]: 2025-10-04 05:50:34.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:35 np0005470441 podman[232864]: 2025-10-04 05:50:35.264536702 +0000 UTC m=+0.050514334 container create ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:50:35 np0005470441 systemd[1]: Started libpod-conmon-ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af.scope.
Oct  4 01:50:35 np0005470441 podman[232864]: 2025-10-04 05:50:35.237250497 +0000 UTC m=+0.023228179 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:50:35 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:50:35 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2437930ac04ee50c7d5daa0f5dd094ae0f83febf13528a1da31993bcce7f062/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:50:35 np0005470441 podman[232864]: 2025-10-04 05:50:35.350730631 +0000 UTC m=+0.136708293 container init ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:50:35 np0005470441 podman[232864]: 2025-10-04 05:50:35.35623702 +0000 UTC m=+0.142214652 container start ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  4 01:50:35 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [NOTICE]   (232883) : New worker (232885) forked
Oct  4 01:50:35 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [NOTICE]   (232883) : Loading success.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.565 2 DEBUG nova.network.neutron [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updated VIF entry in instance network info cache for port 75cd5dc5-982f-4aa2-9be0-bf6346669b8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.566 2 DEBUG nova.network.neutron [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.586 2 DEBUG oslo_concurrency.lockutils [req-ef5360bd-4a67-4b4f-82c3-ae375b60933d req-9a23e5d7-8831-4379-928e-08c4a7215965 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:36.600 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:50:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:36.602 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.666 2 DEBUG nova.compute.manager [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.667 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.668 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.668 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.668 2 DEBUG nova.compute.manager [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No event matching network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 in dict_keys([('network-vif-plugged', '75cd5dc5-982f-4aa2-9be0-bf6346669b8b')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.669 2 WARNING nova.compute.manager [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received unexpected event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 for instance with vm_state building and task_state spawning.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.669 2 DEBUG nova.compute.manager [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.670 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.670 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.671 2 DEBUG oslo_concurrency.lockutils [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.671 2 DEBUG nova.compute.manager [req-1a1c974b-e59a-4c27-b903-c4d7b7278915 req-eee34c85-eb35-49ba-b7d9-dcaacb7e9387 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Processing event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.672 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.678 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557036.6778762, 89f82b75-7054-4ae1-8b6f-7e13cd7789cc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.678 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] VM Resumed (Lifecycle Event)
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.682 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.688 2 INFO nova.virt.libvirt.driver [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance spawned successfully.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.688 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.707 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.719 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.722 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.722 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.723 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.723 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.724 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.724 2 DEBUG nova.virt.libvirt.driver [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.736 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.800 2 INFO nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Took 13.69 seconds to spawn the instance on the hypervisor.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.801 2 DEBUG nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.876 2 INFO nova.compute.manager [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Took 14.16 seconds to build instance.
Oct  4 01:50:36 np0005470441 nova_compute[192626]: 2025-10-04 05:50:36.900 2 DEBUG oslo_concurrency.lockutils [None req-d40a05f5-d031-45e6-8e46-a6bd92e8e3c4 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:50:37 np0005470441 nova_compute[192626]: 2025-10-04 05:50:37.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:38 np0005470441 podman[232895]: 2025-10-04 05:50:38.322393889 +0000 UTC m=+0.070801538 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:50:38 np0005470441 podman[232894]: 2025-10-04 05:50:38.323770978 +0000 UTC m=+0.076409139 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:50:38 np0005470441 nova_compute[192626]: 2025-10-04 05:50:38.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:38 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:50:38.605 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.080 2 DEBUG nova.compute.manager [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.080 2 DEBUG oslo_concurrency.lockutils [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.081 2 DEBUG oslo_concurrency.lockutils [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.081 2 DEBUG oslo_concurrency.lockutils [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.081 2 DEBUG nova.compute.manager [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No waiting events found dispatching network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  4 01:50:39 np0005470441 nova_compute[192626]: 2025-10-04 05:50:39.081 2 WARNING nova.compute.manager [req-d260b36a-ba31-46f3-9627-a50d44714bb5 req-fd99ec24-2c30-4bd0-8ca2-b9a4c82634b7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received unexpected event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b for instance with vm_state active and task_state None.
Oct  4 01:50:41 np0005470441 nova_compute[192626]: 2025-10-04 05:50:41.493 2 DEBUG nova.compute.manager [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:50:41 np0005470441 nova_compute[192626]: 2025-10-04 05:50:41.494 2 DEBUG nova.compute.manager [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing instance network info cache due to event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:50:41 np0005470441 nova_compute[192626]: 2025-10-04 05:50:41.494 2 DEBUG oslo_concurrency.lockutils [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:50:41 np0005470441 nova_compute[192626]: 2025-10-04 05:50:41.494 2 DEBUG oslo_concurrency.lockutils [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:50:41 np0005470441 nova_compute[192626]: 2025-10-04 05:50:41.494 2 DEBUG nova.network.neutron [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing network info cache for port 04bbe94a-5455-4c06-ba6c-4a5efffa9049 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  4 01:50:42 np0005470441 nova_compute[192626]: 2025-10-04 05:50:42.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:42 np0005470441 nova_compute[192626]: 2025-10-04 05:50:42.650 2 DEBUG nova.network.neutron [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updated VIF entry in instance network info cache for port 04bbe94a-5455-4c06-ba6c-4a5efffa9049. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  4 01:50:42 np0005470441 nova_compute[192626]: 2025-10-04 05:50:42.651 2 DEBUG nova.network.neutron [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:50:42 np0005470441 nova_compute[192626]: 2025-10-04 05:50:42.669 2 DEBUG oslo_concurrency.lockutils [req-f90b75a6-28b6-4f72-840f-6383fb00b916 req-6da92656-2c93-4207-b853-dca0848f5768 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:50:43 np0005470441 podman[232939]: 2025-10-04 05:50:43.321670198 +0000 UTC m=+0.067916965 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:50:43 np0005470441 podman[232940]: 2025-10-04 05:50:43.329287847 +0000 UTC m=+0.068690807 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  4 01:50:43 np0005470441 nova_compute[192626]: 2025-10-04 05:50:43.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:47 np0005470441 nova_compute[192626]: 2025-10-04 05:50:47.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:48 np0005470441 nova_compute[192626]: 2025-10-04 05:50:48.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:50:49 np0005470441 podman[232990]: 2025-10-04 05:50:49.302954403 +0000 UTC m=+0.052298876 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6)
Oct  4 01:50:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:50Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:a8:28 10.100.0.5
Oct  4 01:50:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:50:50Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:a8:28 10.100.0.5
Oct  4 01:50:52 np0005470441 nova_compute[192626]: 2025-10-04 05:50:52.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:53 np0005470441 nova_compute[192626]: 2025-10-04 05:50:53.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:56 np0005470441 podman[233012]: 2025-10-04 05:50:56.305723189 +0000 UTC m=+0.056019012 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:50:57 np0005470441 nova_compute[192626]: 2025-10-04 05:50:57.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:50:58 np0005470441 podman[233036]: 2025-10-04 05:50:58.08674354 +0000 UTC m=+0.084645936 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  4 01:50:58 np0005470441 nova_compute[192626]: 2025-10-04 05:50:58.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 podman[233055]: 2025-10-04 05:51:00.34657972 +0000 UTC m=+0.098226176 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.706 2 DEBUG nova.compute.manager [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.706 2 DEBUG nova.compute.manager [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing instance network info cache due to event network-changed-04bbe94a-5455-4c06-ba6c-4a5efffa9049. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.707 2 DEBUG oslo_concurrency.lockutils [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.708 2 DEBUG oslo_concurrency.lockutils [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.708 2 DEBUG nova.network.neutron [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Refreshing network info cache for port 04bbe94a-5455-4c06-ba6c-4a5efffa9049 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.799 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.800 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.800 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.800 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.800 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.801 2 INFO nova.compute.manager [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Terminating instance#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.802 2 DEBUG nova.compute.manager [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:51:00 np0005470441 kernel: tap04bbe94a-54 (unregistering): left promiscuous mode
Oct  4 01:51:00 np0005470441 NetworkManager[51690]: <info>  [1759557060.8287] device (tap04bbe94a-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00344|binding|INFO|Releasing lport 04bbe94a-5455-4c06-ba6c-4a5efffa9049 from this chassis (sb_readonly=0)
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00345|binding|INFO|Setting lport 04bbe94a-5455-4c06-ba6c-4a5efffa9049 down in Southbound
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00346|binding|INFO|Removing iface tap04bbe94a-54 ovn-installed in OVS
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.848 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:a8:28 10.100.0.5'], port_security=['fa:16:3e:22:a8:28 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c845a604-670f-43c9-8bbb-844bb6410caa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=10d4b4c0-3a9c-417c-9b51-c2023e31f023, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=04bbe94a-5455-4c06-ba6c-4a5efffa9049) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.850 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 04bbe94a-5455-4c06-ba6c-4a5efffa9049 in datapath 114f6534-09ca-4cdb-b1a7-f666738e5e43 unbound from our chassis#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.851 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 114f6534-09ca-4cdb-b1a7-f666738e5e43, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.852 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ce03c167-f35c-4094-9d94-77c622310241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.853 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43 namespace which is not needed anymore#033[00m
Oct  4 01:51:00 np0005470441 kernel: tap75cd5dc5-98 (unregistering): left promiscuous mode
Oct  4 01:51:00 np0005470441 NetworkManager[51690]: <info>  [1759557060.8814] device (tap75cd5dc5-98): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00347|binding|INFO|Releasing lport 75cd5dc5-982f-4aa2-9be0-bf6346669b8b from this chassis (sb_readonly=0)
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00348|binding|INFO|Setting lport 75cd5dc5-982f-4aa2-9be0-bf6346669b8b down in Southbound
Oct  4 01:51:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:51:00Z|00349|binding|INFO|Removing iface tap75cd5dc5-98 ovn-installed in OVS
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:00 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:00.950 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:44:9c:4c 2001:db8:0:1:f816:3eff:fe44:9c4c 2001:db8::f816:3eff:fe44:9c4c'], port_security=['fa:16:3e:44:9c:4c 2001:db8:0:1:f816:3eff:fe44:9c4c 2001:db8::f816:3eff:fe44:9c4c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe44:9c4c/64 2001:db8::f816:3eff:fe44:9c4c/64', 'neutron:device_id': '89f82b75-7054-4ae1-8b6f-7e13cd7789cc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c845a604-670f-43c9-8bbb-844bb6410caa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6b2b68f0-e753-4450-950e-3f0a5cb64650, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=75cd5dc5-982f-4aa2-9be0-bf6346669b8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:51:00 np0005470441 nova_compute[192626]: 2025-10-04 05:51:00.962 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [NOTICE]   (232811) : haproxy version is 2.8.14-c23fe91
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [NOTICE]   (232811) : path to executable is /usr/sbin/haproxy
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [WARNING]  (232811) : Exiting Master process...
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [ALERT]    (232811) : Current worker (232813) exited with code 143 (Terminated)
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43[232807]: [WARNING]  (232811) : All workers exited. Exiting... (0)
Oct  4 01:51:01 np0005470441 systemd[1]: libpod-57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0.scope: Deactivated successfully.
Oct  4 01:51:01 np0005470441 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Oct  4 01:51:01 np0005470441 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000002f.scope: Consumed 14.325s CPU time.
Oct  4 01:51:01 np0005470441 systemd-machined[152624]: Machine qemu-26-instance-0000002f terminated.
Oct  4 01:51:01 np0005470441 podman[233112]: 2025-10-04 05:51:01.012490694 +0000 UTC m=+0.071344313 container died 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:51:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0-userdata-shm.mount: Deactivated successfully.
Oct  4 01:51:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay-1245d9b975c652c45d0906dc161aa2c07821a3d008fe67b3f76c618c724469a5-merged.mount: Deactivated successfully.
Oct  4 01:51:01 np0005470441 podman[233112]: 2025-10-04 05:51:01.061986517 +0000 UTC m=+0.120840106 container cleanup 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  4 01:51:01 np0005470441 systemd[1]: libpod-conmon-57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0.scope: Deactivated successfully.
Oct  4 01:51:01 np0005470441 podman[233142]: 2025-10-04 05:51:01.129408937 +0000 UTC m=+0.045511600 container remove 57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.135 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[29e94bc5-1fde-49fc-aacc-91444826ee90]: (4, ('Sat Oct  4 05:51:00 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43 (57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0)\n57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0\nSat Oct  4 05:51:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43 (57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0)\n57e5f8aa1838f90d4ad5351b2b80a40952c9e8de3125517454a26b60889c44e0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.137 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[91b05c22-b57d-4b96-a9f0-cac99b20ebca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.138 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap114f6534-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.140 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 kernel: tap114f6534-00: left promiscuous mode
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.160 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[93abb068-fad0-4254-84b7-88b6e4b03129]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.189 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[496f293e-d361-4417-b4c7-101e6227748d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.191 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[33c37f40-535a-469a-b026-a43b5b4de68b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.208 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3a7fa7-ef99-4ff7-9461-0293960fa9f7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493145, 'reachable_time': 39101, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233165, 'error': None, 'target': 'ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 systemd[1]: run-netns-ovnmeta\x2d114f6534\x2d09ca\x2d4cdb\x2db1a7\x2df666738e5e43.mount: Deactivated successfully.
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.210 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-114f6534-09ca-4cdb-b1a7-f666738e5e43 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.211 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[795dd0a0-68cd-4274-a35d-0f6609026f68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.212 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 75cd5dc5-982f-4aa2-9be0-bf6346669b8b in datapath 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc unbound from our chassis#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.214 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.215 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c18b85-91f7-4515-bcba-5a7b9f1f83db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.215 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc namespace which is not needed anymore#033[00m
Oct  4 01:51:01 np0005470441 NetworkManager[51690]: <info>  [1759557061.2323] manager: (tap75cd5dc5-98): new Tun device (/org/freedesktop/NetworkManager/Devices/167)
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.282 2 INFO nova.virt.libvirt.driver [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Instance destroyed successfully.#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.282 2 DEBUG nova.objects.instance [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 89f82b75-7054-4ae1-8b6f-7e13cd7789cc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.298 2 DEBUG nova.virt.libvirt.vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:50:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:50:36Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.298 2 DEBUG nova.network.os_vif_util [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.215", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.299 2 DEBUG nova.network.os_vif_util [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.299 2 DEBUG os_vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.302 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap04bbe94a-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.310 2 INFO os_vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:a8:28,bridge_name='br-int',has_traffic_filtering=True,id=04bbe94a-5455-4c06-ba6c-4a5efffa9049,network=Network(114f6534-09ca-4cdb-b1a7-f666738e5e43),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap04bbe94a-54')#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.311 2 DEBUG nova.virt.libvirt.vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686939085',display_name='tempest-TestGettingAddress-server-1686939085',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686939085',id=47,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKwNgUjV29Z8kk7tmurNl6Lnqmt4iuIDq85t972pydtKRNI9SHw9UWpehsWIDKbnBnenYtTJQswMVUZ2p1zoOBNA2f4JZQsO/pwGoi8jQr1zcBvzhaTSWGieINfbfDmbnQ==',key_name='tempest-TestGettingAddress-381244630',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:50:36Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-pxlz40cu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:50:36Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=89f82b75-7054-4ae1-8b6f-7e13cd7789cc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.312 2 DEBUG nova.network.os_vif_util [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.313 2 DEBUG nova.network.os_vif_util [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.313 2 DEBUG os_vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.315 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap75cd5dc5-98, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.319 2 INFO os_vif [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:44:9c:4c,bridge_name='br-int',has_traffic_filtering=True,id=75cd5dc5-982f-4aa2-9be0-bf6346669b8b,network=Network(471b274e-0e71-4d3c-ac3e-42e8f82f9dbc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap75cd5dc5-98')#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.320 2 INFO nova.virt.libvirt.driver [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Deleting instance files /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc_del#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.320 2 INFO nova.virt.libvirt.driver [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Deletion of /var/lib/nova/instances/89f82b75-7054-4ae1-8b6f-7e13cd7789cc_del complete#033[00m
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [NOTICE]   (232883) : haproxy version is 2.8.14-c23fe91
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [NOTICE]   (232883) : path to executable is /usr/sbin/haproxy
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [WARNING]  (232883) : Exiting Master process...
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [WARNING]  (232883) : Exiting Master process...
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [ALERT]    (232883) : Current worker (232885) exited with code 143 (Terminated)
Oct  4 01:51:01 np0005470441 neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc[232879]: [WARNING]  (232883) : All workers exited. Exiting... (0)
Oct  4 01:51:01 np0005470441 systemd[1]: libpod-ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af.scope: Deactivated successfully.
Oct  4 01:51:01 np0005470441 podman[233213]: 2025-10-04 05:51:01.340498739 +0000 UTC m=+0.040953089 container died ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  4 01:51:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af-userdata-shm.mount: Deactivated successfully.
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.373 2 INFO nova.compute.manager [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.374 2 DEBUG oslo.service.loopingcall [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.374 2 DEBUG nova.compute.manager [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.374 2 DEBUG nova.network.neutron [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:51:01 np0005470441 systemd[1]: var-lib-containers-storage-overlay-b2437930ac04ee50c7d5daa0f5dd094ae0f83febf13528a1da31993bcce7f062-merged.mount: Deactivated successfully.
Oct  4 01:51:01 np0005470441 podman[233213]: 2025-10-04 05:51:01.387088389 +0000 UTC m=+0.087542749 container cleanup ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3)
Oct  4 01:51:01 np0005470441 systemd[1]: libpod-conmon-ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af.scope: Deactivated successfully.
Oct  4 01:51:01 np0005470441 podman[233244]: 2025-10-04 05:51:01.451910553 +0000 UTC m=+0.043266615 container remove ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.457 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[adef731e-80eb-45a5-b599-375a6014b0fc]: (4, ('Sat Oct  4 05:51:01 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc (ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af)\ned3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af\nSat Oct  4 05:51:01 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc (ed3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af)\ned3c54debe1a15e1b31c8676a3e8b777b6b146ffa17fd1a6ecee8579d92721af\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.459 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d5628243-e860-49c5-b7c4-4e4f6371d31f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.460 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap471b274e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 kernel: tap471b274e-00: left promiscuous mode
Oct  4 01:51:01 np0005470441 nova_compute[192626]: 2025-10-04 05:51:01.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.479 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[23bc18f9-46ba-4b0e-aa68-f603c8766655]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.512 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f9758a37-d3c3-4446-9e55-e76a55714077]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.513 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[910612f9-6e6a-418e-96d6-936d16eb4085]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.528 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bc7bee3c-2fb0-416e-8785-dd231a8b6061]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493235, 'reachable_time': 21719, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233259, 'error': None, 'target': 'ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.530 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-471b274e-0e71-4d3c-ac3e-42e8f82f9dbc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:51:01 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:01.530 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[2c064122-0156-4c90-84cb-6b69af379783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:02 np0005470441 systemd[1]: run-netns-ovnmeta\x2d471b274e\x2d0e71\x2d4d3c\x2dac3e\x2d42e8f82f9dbc.mount: Deactivated successfully.
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.825 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-unplugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.826 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.827 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.827 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.828 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No waiting events found dispatching network-vif-unplugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.828 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-unplugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.829 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.829 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.829 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.830 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.830 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No waiting events found dispatching network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.831 2 WARNING nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received unexpected event network-vif-plugged-04bbe94a-5455-4c06-ba6c-4a5efffa9049 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.831 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-unplugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.832 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.832 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.833 2 DEBUG oslo_concurrency.lockutils [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.833 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No waiting events found dispatching network-vif-unplugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:51:02 np0005470441 nova_compute[192626]: 2025-10-04 05:51:02.834 2 DEBUG nova.compute.manager [req-58abd54b-359d-4c48-b7f4-8130e5eefa5f req-a82d73a8-abc2-4f8c-8804-3bcdbd907b0e 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-unplugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.363 2 DEBUG nova.network.neutron [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updated VIF entry in instance network info cache for port 04bbe94a-5455-4c06-ba6c-4a5efffa9049. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.364 2 DEBUG nova.network.neutron [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [{"id": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "address": "fa:16:3e:22:a8:28", "network": {"id": "114f6534-09ca-4cdb-b1a7-f666738e5e43", "bridge": "br-int", "label": "tempest-network-smoke--290166077", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap04bbe94a-54", "ovs_interfaceid": "04bbe94a-5455-4c06-ba6c-4a5efffa9049", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "address": "fa:16:3e:44:9c:4c", "network": {"id": "471b274e-0e71-4d3c-ac3e-42e8f82f9dbc", "bridge": "br-int", "label": "tempest-network-smoke--1299760277", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe44:9c4c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap75cd5dc5-98", "ovs_interfaceid": "75cd5dc5-982f-4aa2-9be0-bf6346669b8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.403 2 DEBUG oslo_concurrency.lockutils [req-8b160cfa-3ef6-4bf8-87f6-26492d058695 req-a6be49a2-970a-4605-9f5c-caba574bba04 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-89f82b75-7054-4ae1-8b6f-7e13cd7789cc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.705 2 DEBUG nova.network.neutron [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.730 2 INFO nova.compute.manager [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Took 2.36 seconds to deallocate network for instance.#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.785 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.786 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.851 2 DEBUG nova.compute.provider_tree [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.871 2 DEBUG nova.scheduler.client.report [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.893 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.913 2 INFO nova.scheduler.client.report [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 89f82b75-7054-4ae1-8b6f-7e13cd7789cc#033[00m
Oct  4 01:51:03 np0005470441 nova_compute[192626]: 2025-10-04 05:51:03.989 2 DEBUG oslo_concurrency.lockutils [None req-0fd1bb5a-b3fb-4f4e-af21-1c0dc8ba6914 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.915 2 DEBUG nova.compute.manager [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-deleted-04bbe94a-5455-4c06-ba6c-4a5efffa9049 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.915 2 DEBUG nova.compute.manager [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 DEBUG oslo_concurrency.lockutils [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 DEBUG oslo_concurrency.lockutils [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 DEBUG oslo_concurrency.lockutils [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "89f82b75-7054-4ae1-8b6f-7e13cd7789cc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 DEBUG nova.compute.manager [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] No waiting events found dispatching network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 WARNING nova.compute.manager [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received unexpected event network-vif-plugged-75cd5dc5-982f-4aa2-9be0-bf6346669b8b for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:51:04 np0005470441 nova_compute[192626]: 2025-10-04 05:51:04.916 2 DEBUG nova.compute.manager [req-f95cdbae-0064-45f4-a5f2-bcc9ff556300 req-13096d6b-06dd-4f40-865e-276c7255b271 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Received event network-vif-deleted-75cd5dc5-982f-4aa2-9be0-bf6346669b8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:51:06 np0005470441 nova_compute[192626]: 2025-10-04 05:51:06.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:06.759 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:06.759 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:06.759 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:08 np0005470441 nova_compute[192626]: 2025-10-04 05:51:08.043 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:08 np0005470441 nova_compute[192626]: 2025-10-04 05:51:08.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:08 np0005470441 nova_compute[192626]: 2025-10-04 05:51:08.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:09 np0005470441 podman[233261]: 2025-10-04 05:51:09.314791062 +0000 UTC m=+0.062870189 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:51:09 np0005470441 podman[233260]: 2025-10-04 05:51:09.345077073 +0000 UTC m=+0.093010316 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  4 01:51:11 np0005470441 nova_compute[192626]: 2025-10-04 05:51:11.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:13 np0005470441 nova_compute[192626]: 2025-10-04 05:51:13.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:13 np0005470441 nova_compute[192626]: 2025-10-04 05:51:13.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:13 np0005470441 nova_compute[192626]: 2025-10-04 05:51:13.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:51:14 np0005470441 podman[233300]: 2025-10-04 05:51:14.311900309 +0000 UTC m=+0.064846366 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:51:14 np0005470441 podman[233301]: 2025-10-04 05:51:14.325436879 +0000 UTC m=+0.068215434 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, 
org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:51:14 np0005470441 nova_compute[192626]: 2025-10-04 05:51:14.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:14 np0005470441 nova_compute[192626]: 2025-10-04 05:51:14.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:51:14 np0005470441 nova_compute[192626]: 2025-10-04 05:51:14.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:51:14 np0005470441 nova_compute[192626]: 2025-10-04 05:51:14.732 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:51:14 np0005470441 nova_compute[192626]: 2025-10-04 05:51:14.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:16 np0005470441 nova_compute[192626]: 2025-10-04 05:51:16.282 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557061.2804542, 89f82b75-7054-4ae1-8b6f-7e13cd7789cc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:51:16 np0005470441 nova_compute[192626]: 2025-10-04 05:51:16.283 2 INFO nova.compute.manager [-] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:51:16 np0005470441 nova_compute[192626]: 2025-10-04 05:51:16.309 2 DEBUG nova.compute.manager [None req-e8204140-84a0-4473-a0ca-befb803a83c9 - - - - - -] [instance: 89f82b75-7054-4ae1-8b6f-7e13cd7789cc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:51:16 np0005470441 nova_compute[192626]: 2025-10-04 05:51:16.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:17 np0005470441 nova_compute[192626]: 2025-10-04 05:51:17.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:17 np0005470441 nova_compute[192626]: 2025-10-04 05:51:17.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:17 np0005470441 nova_compute[192626]: 2025-10-04 05:51:17.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.748 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.748 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.749 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.749 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.898 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.900 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5715MB free_disk=73.42041778564453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.900 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.900 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.960 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.961 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:51:18 np0005470441 nova_compute[192626]: 2025-10-04 05:51:18.989 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:51:19 np0005470441 nova_compute[192626]: 2025-10-04 05:51:19.003 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:51:19 np0005470441 nova_compute[192626]: 2025-10-04 05:51:19.032 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:51:19 np0005470441 nova_compute[192626]: 2025-10-04 05:51:19.033 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:51:20 np0005470441 nova_compute[192626]: 2025-10-04 05:51:20.028 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:20 np0005470441 podman[233340]: 2025-10-04 05:51:20.375490892 +0000 UTC m=+0.106791442 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git)
Oct  4 01:51:21 np0005470441 nova_compute[192626]: 2025-10-04 05:51:21.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:21 np0005470441 nova_compute[192626]: 2025-10-04 05:51:21.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:23 np0005470441 nova_compute[192626]: 2025-10-04 05:51:23.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:23 np0005470441 nova_compute[192626]: 2025-10-04 05:51:23.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:51:26 np0005470441 nova_compute[192626]: 2025-10-04 05:51:26.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:27 np0005470441 podman[233361]: 2025-10-04 05:51:27.312938901 +0000 UTC m=+0.065491905 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:51:28 np0005470441 podman[233385]: 2025-10-04 05:51:28.338156909 +0000 UTC m=+0.068509652 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:51:28 np0005470441 nova_compute[192626]: 2025-10-04 05:51:28.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:31 np0005470441 nova_compute[192626]: 2025-10-04 05:51:31.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:31 np0005470441 podman[233405]: 2025-10-04 05:51:31.364345414 +0000 UTC m=+0.108637925 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:51:33 np0005470441 nova_compute[192626]: 2025-10-04 05:51:33.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:36 np0005470441 nova_compute[192626]: 2025-10-04 05:51:36.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:38 np0005470441 nova_compute[192626]: 2025-10-04 05:51:38.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:39.363 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:63:67 2001:db8:0:1:f816:3eff:fed5:6367 2001:db8::f816:3eff:fed5:6367'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fed5:6367/64 2001:db8::f816:3eff:fed5:6367/64', 'neutron:device_id': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c944d58f-ea5b-4aae-8c43-03ee9fbeff86, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=19700a80-c8b3-48b0-af80-6b1d8e6ad9c1) old=Port_Binding(mac=['fa:16:3e:d5:63:67 2001:db8::f816:3eff:fed5:6367'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed5:6367/64', 'neutron:device_id': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:51:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:39.365 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 19700a80-c8b3-48b0-af80-6b1d8e6ad9c1 in datapath f186cb58-c427-4b55-b47a-91b4fed4e8b2 updated#033[00m
Oct  4 01:51:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:39.366 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f186cb58-c427-4b55-b47a-91b4fed4e8b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:51:39 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:39.368 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e0db13d1-2e63-4b07-85e8-599bcc4c3100]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:51:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:40.191 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:51:40 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:40.192 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:51:40 np0005470441 nova_compute[192626]: 2025-10-04 05:51:40.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:40 np0005470441 podman[233433]: 2025-10-04 05:51:40.341347718 +0000 UTC m=+0.071071375 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:51:40 np0005470441 podman[233432]: 2025-10-04 05:51:40.345532549 +0000 UTC m=+0.086307994 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:51:41 np0005470441 nova_compute[192626]: 2025-10-04 05:51:41.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:42 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:51:42.194 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:51:43 np0005470441 nova_compute[192626]: 2025-10-04 05:51:43.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:45 np0005470441 podman[233477]: 2025-10-04 05:51:45.327300814 +0000 UTC m=+0.068827010 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, config_id=iscsid, managed_by=edpm_ansible)
Oct  4 01:51:45 np0005470441 podman[233478]: 2025-10-04 05:51:45.349464672 +0000 UTC m=+0.095174579 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:51:46 np0005470441 nova_compute[192626]: 2025-10-04 05:51:46.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:48 np0005470441 nova_compute[192626]: 2025-10-04 05:51:48.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:51 np0005470441 podman[233519]: 2025-10-04 05:51:51.308647522 +0000 UTC m=+0.065911137 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  4 01:51:51 np0005470441 nova_compute[192626]: 2025-10-04 05:51:51.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:53 np0005470441 nova_compute[192626]: 2025-10-04 05:51:53.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:56 np0005470441 nova_compute[192626]: 2025-10-04 05:51:56.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:58 np0005470441 podman[233540]: 2025-10-04 05:51:58.302661468 +0000 UTC m=+0.054303223 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:51:58 np0005470441 nova_compute[192626]: 2025-10-04 05:51:58.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:51:59 np0005470441 podman[233564]: 2025-10-04 05:51:59.304444503 +0000 UTC m=+0.059322657 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  4 01:52:00 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:00Z|00350|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  4 01:52:01 np0005470441 nova_compute[192626]: 2025-10-04 05:52:01.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:02 np0005470441 podman[233583]: 2025-10-04 05:52:02.351036895 +0000 UTC m=+0.099358779 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:52:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:52:03 np0005470441 nova_compute[192626]: 2025-10-04 05:52:03.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:06 np0005470441 nova_compute[192626]: 2025-10-04 05:52:06.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:06.760 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:06.761 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:06.761 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:08 np0005470441 nova_compute[192626]: 2025-10-04 05:52:08.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:08 np0005470441 nova_compute[192626]: 2025-10-04 05:52:08.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:08 np0005470441 nova_compute[192626]: 2025-10-04 05:52:08.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:11 np0005470441 podman[233610]: 2025-10-04 05:52:11.3023509 +0000 UTC m=+0.053531691 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:52:11 np0005470441 podman[233609]: 2025-10-04 05:52:11.31173853 +0000 UTC m=+0.067030239 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:52:11 np0005470441 nova_compute[192626]: 2025-10-04 05:52:11.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:13 np0005470441 nova_compute[192626]: 2025-10-04 05:52:13.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:14 np0005470441 nova_compute[192626]: 2025-10-04 05:52:14.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:14 np0005470441 nova_compute[192626]: 2025-10-04 05:52:14.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:52:16 np0005470441 podman[233653]: 2025-10-04 05:52:16.326356569 +0000 UTC m=+0.082996338 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  4 01:52:16 np0005470441 podman[233654]: 2025-10-04 05:52:16.348851656 +0000 UTC m=+0.098678629 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.733 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:52:16 np0005470441 nova_compute[192626]: 2025-10-04 05:52:16.733 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.745 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.745 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.941 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.943 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.42043685913086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.943 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:18 np0005470441 nova_compute[192626]: 2025-10-04 05:52:18.944 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.002 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.003 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.026 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.043 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.043 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.062 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.084 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.107 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.123 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.126 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:52:19 np0005470441 nova_compute[192626]: 2025-10-04 05:52:19.126 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.183s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:21 np0005470441 nova_compute[192626]: 2025-10-04 05:52:21.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:22 np0005470441 podman[233694]: 2025-10-04 05:52:22.343676029 +0000 UTC m=+0.083353898 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Oct  4 01:52:23 np0005470441 nova_compute[192626]: 2025-10-04 05:52:23.127 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:23 np0005470441 nova_compute[192626]: 2025-10-04 05:52:23.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:24 np0005470441 nova_compute[192626]: 2025-10-04 05:52:24.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:52:26 np0005470441 nova_compute[192626]: 2025-10-04 05:52:26.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:28 np0005470441 nova_compute[192626]: 2025-10-04 05:52:28.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:28 np0005470441 podman[233714]: 2025-10-04 05:52:28.561940101 +0000 UTC m=+0.052025287 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:52:30 np0005470441 podman[233739]: 2025-10-04 05:52:30.314959774 +0000 UTC m=+0.057893086 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:52:31 np0005470441 nova_compute[192626]: 2025-10-04 05:52:31.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:33 np0005470441 podman[233760]: 2025-10-04 05:52:33.330573675 +0000 UTC m=+0.082420162 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:52:33 np0005470441 nova_compute[192626]: 2025-10-04 05:52:33.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:36 np0005470441 nova_compute[192626]: 2025-10-04 05:52:36.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:38 np0005470441 nova_compute[192626]: 2025-10-04 05:52:38.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.685 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.685 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.709 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.797 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.798 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.808 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.808 2 INFO nova.compute.claims [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.910 2 DEBUG nova.compute.provider_tree [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.925 2 DEBUG nova.scheduler.client.report [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.952 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:41 np0005470441 nova_compute[192626]: 2025-10-04 05:52:41.953 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.015 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.016 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.037 2 INFO nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.055 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.146 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.148 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.149 2 INFO nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Creating image(s)#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.150 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.151 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.153 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.179 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.276 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.277 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.278 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.294 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.321 2 DEBUG nova.policy [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:52:42 np0005470441 podman[233789]: 2025-10-04 05:52:42.327999507 +0000 UTC m=+0.073618308 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, org.label-schema.build-date=20251001)
Oct  4 01:52:42 np0005470441 podman[233790]: 2025-10-04 05:52:42.346418997 +0000 UTC m=+0.078068117 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.371 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.372 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.403 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.404 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.404 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.459 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.461 2 DEBUG nova.virt.disk.api [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.461 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.519 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.520 2 DEBUG nova.virt.disk.api [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.521 2 DEBUG nova.objects.instance [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 1b4da3b1-cafb-41e8-8eae-ac240b41a891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.638 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.638 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Ensure instance console log exists: /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.639 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.639 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:42 np0005470441 nova_compute[192626]: 2025-10-04 05:52:42.640 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:43 np0005470441 nova_compute[192626]: 2025-10-04 05:52:43.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:43.048 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:52:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:43.048 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:52:43 np0005470441 nova_compute[192626]: 2025-10-04 05:52:43.373 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Successfully created port: a5671074-2db1-4974-bdb0-f92d8745da99 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:52:43 np0005470441 nova_compute[192626]: 2025-10-04 05:52:43.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:43 np0005470441 nova_compute[192626]: 2025-10-04 05:52:43.956 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Successfully created port: 81781200-f6a4-4764-bbeb-e0994759b68a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:52:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:44.050 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.694 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Successfully updated port: a5671074-2db1-4974-bdb0-f92d8745da99 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.864 2 DEBUG nova.compute.manager [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.865 2 DEBUG nova.compute.manager [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing instance network info cache due to event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.865 2 DEBUG oslo_concurrency.lockutils [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.866 2 DEBUG oslo_concurrency.lockutils [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:52:45 np0005470441 nova_compute[192626]: 2025-10-04 05:52:45.866 2 DEBUG nova.network.neutron [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing network info cache for port a5671074-2db1-4974-bdb0-f92d8745da99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:52:46 np0005470441 nova_compute[192626]: 2025-10-04 05:52:46.284 2 DEBUG nova.network.neutron [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:52:46 np0005470441 nova_compute[192626]: 2025-10-04 05:52:46.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:46 np0005470441 nova_compute[192626]: 2025-10-04 05:52:46.647 2 DEBUG nova.network.neutron [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:52:46 np0005470441 nova_compute[192626]: 2025-10-04 05:52:46.662 2 DEBUG oslo_concurrency.lockutils [req-61b95300-a832-4da3-9853-ac800fa033ee req-74fccfb6-8bb6-4571-a5dd-c3be1f671268 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:52:47 np0005470441 podman[233845]: 2025-10-04 05:52:47.325698828 +0000 UTC m=+0.080555259 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.324 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Successfully updated port: 81781200-f6a4-4764-bbeb-e0994759b68a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.342 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.343 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.343 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:52:47 np0005470441 podman[233846]: 2025-10-04 05:52:47.346814906 +0000 UTC m=+0.089047302 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.466 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.954 2 DEBUG nova.compute.manager [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-changed-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.954 2 DEBUG nova.compute.manager [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing instance network info cache due to event network-changed-81781200-f6a4-4764-bbeb-e0994759b68a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:52:47 np0005470441 nova_compute[192626]: 2025-10-04 05:52:47.955 2 DEBUG oslo_concurrency.lockutils [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:52:48 np0005470441 nova_compute[192626]: 2025-10-04 05:52:48.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.302 2 DEBUG nova.network.neutron [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.334 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.334 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance network_info: |[{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.335 2 DEBUG oslo_concurrency.lockutils [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.335 2 DEBUG nova.network.neutron [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing network info cache for port 81781200-f6a4-4764-bbeb-e0994759b68a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.343 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Start _get_guest_xml network_info=[{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.350 2 WARNING nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.356 2 DEBUG nova.virt.libvirt.host [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.357 2 DEBUG nova.virt.libvirt.host [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.368 2 DEBUG nova.virt.libvirt.host [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.368 2 DEBUG nova.virt.libvirt.host [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.370 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.371 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.372 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.372 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.373 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.373 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.374 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.374 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.375 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.375 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.375 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.376 2 DEBUG nova.virt.hardware [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.379 2 DEBUG nova.virt.libvirt.vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:52:42Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.380 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.380 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.381 2 DEBUG nova.virt.libvirt.vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:52:42Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.381 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.382 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.383 2 DEBUG nova.objects.instance [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1b4da3b1-cafb-41e8-8eae-ac240b41a891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.400 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <uuid>1b4da3b1-cafb-41e8-8eae-ac240b41a891</uuid>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <name>instance-00000031</name>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-1379993306</nova:name>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:52:49</nova:creationTime>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:port uuid="a5671074-2db1-4974-bdb0-f92d8745da99">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        <nova:port uuid="81781200-f6a4-4764-bbeb-e0994759b68a">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe38:ad1e" ipVersion="6"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe38:ad1e" ipVersion="6"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="serial">1b4da3b1-cafb-41e8-8eae-ac240b41a891</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="uuid">1b4da3b1-cafb-41e8-8eae-ac240b41a891</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.config"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:26:fa:3f"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <target dev="tapa5671074-2d"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:38:ad:1e"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <target dev="tap81781200-f6"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/console.log" append="off"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:52:49 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:52:49 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:52:49 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:52:49 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.400 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Preparing to wait for external event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.400 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.400 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.401 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.401 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Preparing to wait for external event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.401 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.401 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.401 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.402 2 DEBUG nova.virt.libvirt.vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:52:42Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.402 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.402 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.403 2 DEBUG os_vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.404 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.404 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.408 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5671074-2d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.408 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5671074-2d, col_values=(('external_ids', {'iface-id': 'a5671074-2db1-4974-bdb0-f92d8745da99', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:fa:3f', 'vm-uuid': '1b4da3b1-cafb-41e8-8eae-ac240b41a891'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 NetworkManager[51690]: <info>  [1759557169.4109] manager: (tapa5671074-2d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.418 2 INFO os_vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d')#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.419 2 DEBUG nova.virt.libvirt.vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:52:42Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.419 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.420 2 DEBUG nova.network.os_vif_util [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.420 2 DEBUG os_vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.420 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.421 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81781200-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.423 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap81781200-f6, col_values=(('external_ids', {'iface-id': '81781200-f6a4-4764-bbeb-e0994759b68a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:ad:1e', 'vm-uuid': '1b4da3b1-cafb-41e8-8eae-ac240b41a891'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 NetworkManager[51690]: <info>  [1759557169.4253] manager: (tap81781200-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.432 2 INFO os_vif [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6')#033[00m
Oct  4 01:52:49 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:52:49 np0005470441 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.533 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.534 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.534 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:26:fa:3f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.534 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:38:ad:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.535 2 INFO nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Using config drive#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.863 2 INFO nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Creating config drive at /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.config#033[00m
Oct  4 01:52:49 np0005470441 nova_compute[192626]: 2025-10-04 05:52:49.873 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpffmbji_v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.008 2 DEBUG oslo_concurrency.processutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpffmbji_v" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:52:50 np0005470441 kernel: tapa5671074-2d: entered promiscuous mode
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1142] manager: (tapa5671074-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00351|binding|INFO|Claiming lport a5671074-2db1-4974-bdb0-f92d8745da99 for this chassis.
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00352|binding|INFO|a5671074-2db1-4974-bdb0-f92d8745da99: Claiming fa:16:3e:26:fa:3f 10.100.0.7
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1331] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1346] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1369] manager: (tap81781200-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.137 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:fa:3f 10.100.0.7'], port_security=['fa:16:3e:26:fa:3f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1b4da3b1-cafb-41e8-8eae-ac240b41a891', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-199b9470-5ecb-408d-8026-8037800b7e96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '630fd102-b50c-4f61-a7b2-ff563ea84794', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=352f76fb-425b-4d46-85bc-fbca63f72e0a, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a5671074-2db1-4974-bdb0-f92d8745da99) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.138 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a5671074-2db1-4974-bdb0-f92d8745da99 in datapath 199b9470-5ecb-408d-8026-8037800b7e96 bound to our chassis#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.139 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 199b9470-5ecb-408d-8026-8037800b7e96#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.156 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bf8dc6d8-1431-401c-af6c-df5cbdc639ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.157 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap199b9470-51 in ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:52:50 np0005470441 systemd-udevd[233910]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:52:50 np0005470441 systemd-udevd[233911]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.159 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap199b9470-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.160 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f02ca1cb-8b0c-445a-8a28-3a3c5719715c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.161 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e37ea602-73d0-48fb-a617-69551406463e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.173 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[20c5fd4b-35a4-49f7-96a6-8ec16092e226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1781] device (tapa5671074-2d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.1792] device (tapa5671074-2d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:52:50 np0005470441 systemd-machined[152624]: New machine qemu-27-instance-00000031.
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.200 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7106a892-64a0-406e-bd16-a87a00d17d21]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 systemd[1]: Started Virtual Machine qemu-27-instance-00000031.
Oct  4 01:52:50 np0005470441 kernel: tap81781200-f6: entered promiscuous mode
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.2256] device (tap81781200-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.2271] device (tap81781200-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00353|binding|INFO|Claiming lport 81781200-f6a4-4764-bbeb-e0994759b68a for this chassis.
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00354|binding|INFO|81781200-f6a4-4764-bbeb-e0994759b68a: Claiming fa:16:3e:38:ad:1e 2001:db8:0:1:f816:3eff:fe38:ad1e 2001:db8::f816:3eff:fe38:ad1e
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00355|binding|INFO|Setting lport a5671074-2db1-4974-bdb0-f92d8745da99 ovn-installed in OVS
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.245 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0e1bba-d558-45f7-b344-d6fca2d4e9e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.251 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ad:1e 2001:db8:0:1:f816:3eff:fe38:ad1e 2001:db8::f816:3eff:fe38:ad1e'], port_security=['fa:16:3e:38:ad:1e 2001:db8:0:1:f816:3eff:fe38:ad1e 2001:db8::f816:3eff:fe38:ad1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe38:ad1e/64 2001:db8::f816:3eff:fe38:ad1e/64', 'neutron:device_id': '1b4da3b1-cafb-41e8-8eae-ac240b41a891', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '630fd102-b50c-4f61-a7b2-ff563ea84794', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c944d58f-ea5b-4aae-8c43-03ee9fbeff86, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=81781200-f6a4-4764-bbeb-e0994759b68a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00356|binding|INFO|Setting lport a5671074-2db1-4974-bdb0-f92d8745da99 up in Southbound
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00357|binding|INFO|Setting lport 81781200-f6a4-4764-bbeb-e0994759b68a ovn-installed in OVS
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.2642] manager: (tap199b9470-50): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00358|binding|INFO|Setting lport 81781200-f6a4-4764-bbeb-e0994759b68a up in Southbound
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.264 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[830a1d65-a74e-4100-8d2d-658e8c2803b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.307 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[08bfc9ab-d1c4-49ae-8c6d-9ec9900b248d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.310 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d6ceb3-6864-4c85-92e2-0d15e6d93228]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.3320] device (tap199b9470-50): carrier: link connected
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.338 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8a30dc-758f-45a0-9713-9e7d2dd41791]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.355 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c910abce-00f7-4f1a-9e62-924f07b659be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap199b9470-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:67:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506798, 'reachable_time': 42637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233948, 'error': None, 'target': 'ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.370 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5fd8ae-bf90-4052-8a59-218c17826249]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe51:671c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506798, 'tstamp': 506798}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233949, 'error': None, 'target': 'ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.389 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[15371013-f3c2-442e-ab33-718549d14007]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap199b9470-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:51:67:1c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506798, 'reachable_time': 42637, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233950, 'error': None, 'target': 'ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.420 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8498c96d-7161-4594-b279-df59f8f371aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.450 2 DEBUG nova.compute.manager [req-1da3551a-1ce5-4023-9d58-3c3a21163161 req-9c96bcf2-7e99-4e41-b457-3b5da71a3fa1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.451 2 DEBUG oslo_concurrency.lockutils [req-1da3551a-1ce5-4023-9d58-3c3a21163161 req-9c96bcf2-7e99-4e41-b457-3b5da71a3fa1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.451 2 DEBUG oslo_concurrency.lockutils [req-1da3551a-1ce5-4023-9d58-3c3a21163161 req-9c96bcf2-7e99-4e41-b457-3b5da71a3fa1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.452 2 DEBUG oslo_concurrency.lockutils [req-1da3551a-1ce5-4023-9d58-3c3a21163161 req-9c96bcf2-7e99-4e41-b457-3b5da71a3fa1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.452 2 DEBUG nova.compute.manager [req-1da3551a-1ce5-4023-9d58-3c3a21163161 req-9c96bcf2-7e99-4e41-b457-3b5da71a3fa1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Processing event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.485 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[24ef9dc6-1251-45a0-89eb-6ef355c9a9b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.486 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap199b9470-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.487 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.488 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap199b9470-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 NetworkManager[51690]: <info>  [1759557170.4908] manager: (tap199b9470-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Oct  4 01:52:50 np0005470441 kernel: tap199b9470-50: entered promiscuous mode
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.495 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap199b9470-50, col_values=(('external_ids', {'iface-id': '4d89df05-290a-4145-aafc-e25c9b03deb3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:50Z|00359|binding|INFO|Releasing lport 4d89df05-290a-4145-aafc-e25c9b03deb3 from this chassis (sb_readonly=0)
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.500 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/199b9470-5ecb-408d-8026-8037800b7e96.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/199b9470-5ecb-408d-8026-8037800b7e96.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.501 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a8b7b8-53a7-41f2-8fb5-b9bdeac98be9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.502 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-199b9470-5ecb-408d-8026-8037800b7e96
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/199b9470-5ecb-408d-8026-8037800b7e96.pid.haproxy
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 199b9470-5ecb-408d-8026-8037800b7e96
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:52:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:50.503 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96', 'env', 'PROCESS_TAG=haproxy-199b9470-5ecb-408d-8026-8037800b7e96', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/199b9470-5ecb-408d-8026-8037800b7e96.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.520 2 DEBUG nova.compute.manager [req-115fff80-4883-455d-88be-10d20b325e6c req-b047a338-f3b8-4f44-9f57-5de839bb0e51 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.521 2 DEBUG oslo_concurrency.lockutils [req-115fff80-4883-455d-88be-10d20b325e6c req-b047a338-f3b8-4f44-9f57-5de839bb0e51 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.521 2 DEBUG oslo_concurrency.lockutils [req-115fff80-4883-455d-88be-10d20b325e6c req-b047a338-f3b8-4f44-9f57-5de839bb0e51 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.522 2 DEBUG oslo_concurrency.lockutils [req-115fff80-4883-455d-88be-10d20b325e6c req-b047a338-f3b8-4f44-9f57-5de839bb0e51 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.522 2 DEBUG nova.compute.manager [req-115fff80-4883-455d-88be-10d20b325e6c req-b047a338-f3b8-4f44-9f57-5de839bb0e51 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Processing event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.825 2 DEBUG nova.network.neutron [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updated VIF entry in instance network info cache for port 81781200-f6a4-4764-bbeb-e0994759b68a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.826 2 DEBUG nova.network.neutron [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.843 2 DEBUG oslo_concurrency.lockutils [req-3a33a391-751b-42ed-8e8f-2e3b798b1c2a req-27d6e02c-bf1b-4e22-b19a-9b192c10e7a2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.946 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557170.9460008, 1b4da3b1-cafb-41e8-8eae-ac240b41a891 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.947 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] VM Started (Lifecycle Event)#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.951 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.956 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.960 2 INFO nova.virt.libvirt.driver [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance spawned successfully.#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.961 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:52:50 np0005470441 podman[233988]: 2025-10-04 05:52:50.962242501 +0000 UTC m=+0.096393604 container create c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.968 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.973 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.982 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.984 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.985 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.986 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.987 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.987 2 DEBUG nova.virt.libvirt.driver [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:52:50 np0005470441 podman[233988]: 2025-10-04 05:52:50.899357812 +0000 UTC m=+0.033509015 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.995 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.996 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557170.9462013, 1b4da3b1-cafb-41e8-8eae-ac240b41a891 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:52:50 np0005470441 nova_compute[192626]: 2025-10-04 05:52:50.997 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:52:51 np0005470441 systemd[1]: Started libpod-conmon-c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77.scope.
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.023 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.029 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557170.9547796, 1b4da3b1-cafb-41e8-8eae-ac240b41a891 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.029 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:52:51 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:52:51 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05c29a4f3835dd1ee7fc8588c8e3c1bd96c39f46e1830a58b222ebf172c670ca/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.049 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.054 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.060 2 INFO nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Took 8.91 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.060 2 DEBUG nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:52:51 np0005470441 podman[233988]: 2025-10-04 05:52:51.061850986 +0000 UTC m=+0.196002099 container init c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.070 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:52:51 np0005470441 podman[233988]: 2025-10-04 05:52:51.071769941 +0000 UTC m=+0.205921054 container start c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:52:51 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [NOTICE]   (234008) : New worker (234010) forked
Oct  4 01:52:51 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [NOTICE]   (234008) : Loading success.
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.134 2 INFO nova.compute.manager [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Took 9.37 seconds to build instance.#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.137 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 81781200-f6a4-4764-bbeb-e0994759b68a in datapath f186cb58-c427-4b55-b47a-91b4fed4e8b2 unbound from our chassis#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.139 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f186cb58-c427-4b55-b47a-91b4fed4e8b2#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.151 2 DEBUG oslo_concurrency.lockutils [None req-1bdee2e9-a3fa-4b31-bb33-4525512d4286 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.150 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[10b06357-1664-4991-a010-69a866edc827]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.152 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf186cb58-c1 in ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.154 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf186cb58-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.154 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ada549fa-fb4b-42ed-9f9f-50154ce5410e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.155 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5fa57c24-ae53-435a-8dfb-340b9154f230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.170 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd34732-2b29-430f-975e-8422fa8823e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.193 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c2fc60e1-d0dc-4f76-a62c-fc1a8c2f6b28]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.227 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[031f1601-9dd7-46f9-9c5f-a1503725645b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 NetworkManager[51690]: <info>  [1759557171.2360] manager: (tapf186cb58-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.234 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f32509c-3b0c-44a7-9644-6a52fd240882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 systemd-udevd[233937]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.271 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[315d6b70-06a7-4a47-b324-0fe9a3633b1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.275 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[dae4cf15-5302-44bd-affd-a05aeb70c9f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 NetworkManager[51690]: <info>  [1759557171.3068] device (tapf186cb58-c0): carrier: link connected
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.314 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8ea8e682-591d-41da-b60c-0fc6f31ae73f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.335 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2f949238-56f4-477d-8c28-ab14d80b87b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf186cb58-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:63:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506896, 'reachable_time': 29456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234029, 'error': None, 'target': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.357 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cbc0f60c-f792-4912-81d7-32a874f84e08]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:6367'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 506896, 'tstamp': 506896}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234030, 'error': None, 'target': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.375 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[4d19c18d-34ba-4f60-a733-6ce9aacb935d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf186cb58-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:63:67'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 99], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506896, 'reachable_time': 29456, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234031, 'error': None, 'target': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.414 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b0db773c-6e7e-4a22-a96b-fc58b7244135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.455 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[880d951d-9754-47ef-b57c-1c55b72cf55d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.457 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf186cb58-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.457 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.458 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf186cb58-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:51 np0005470441 NetworkManager[51690]: <info>  [1759557171.4613] manager: (tapf186cb58-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Oct  4 01:52:51 np0005470441 kernel: tapf186cb58-c0: entered promiscuous mode
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.463 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf186cb58-c0, col_values=(('external_ids', {'iface-id': '19700a80-c8b3-48b0-af80-6b1d8e6ad9c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:51 np0005470441 ovn_controller[94840]: 2025-10-04T05:52:51Z|00360|binding|INFO|Releasing lport 19700a80-c8b3-48b0-af80-6b1d8e6ad9c1 from this chassis (sb_readonly=0)
Oct  4 01:52:51 np0005470441 nova_compute[192626]: 2025-10-04 05:52:51.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.493 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f186cb58-c427-4b55-b47a-91b4fed4e8b2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f186cb58-c427-4b55-b47a-91b4fed4e8b2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.494 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9e07ff7e-2de0-4876-a33a-ee0f6e87b720]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.495 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-f186cb58-c427-4b55-b47a-91b4fed4e8b2
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/f186cb58-c427-4b55-b47a-91b4fed4e8b2.pid.haproxy
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID f186cb58-c427-4b55-b47a-91b4fed4e8b2
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:52:51 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:52:51.495 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'env', 'PROCESS_TAG=haproxy-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f186cb58-c427-4b55-b47a-91b4fed4e8b2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:52:51 np0005470441 podman[234061]: 2025-10-04 05:52:51.948814398 +0000 UTC m=+0.085310205 container create 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  4 01:52:51 np0005470441 podman[234061]: 2025-10-04 05:52:51.891943762 +0000 UTC m=+0.028439659 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:52:51 np0005470441 systemd[1]: Started libpod-conmon-6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5.scope.
Oct  4 01:52:52 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:52:52 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83bbbf9da96c061bf99abcc7309449bea4ce565ea896870cc8608d4f739c87d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:52:52 np0005470441 podman[234061]: 2025-10-04 05:52:52.050180194 +0000 UTC m=+0.186676021 container init 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  4 01:52:52 np0005470441 podman[234061]: 2025-10-04 05:52:52.057438333 +0000 UTC m=+0.193934140 container start 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:52:52 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [NOTICE]   (234080) : New worker (234082) forked
Oct  4 01:52:52 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [NOTICE]   (234080) : Loading success.
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.533 2 DEBUG nova.compute.manager [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.534 2 DEBUG oslo_concurrency.lockutils [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.534 2 DEBUG oslo_concurrency.lockutils [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.535 2 DEBUG oslo_concurrency.lockutils [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.535 2 DEBUG nova.compute.manager [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.536 2 WARNING nova.compute.manager [req-a9bec18e-35ea-4618-80de-d4a9475b668b req-4439b379-050c-4eba-8bb7-1047aab8bfef 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received unexpected event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a for instance with vm_state active and task_state None.#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.586 2 DEBUG nova.compute.manager [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.586 2 DEBUG oslo_concurrency.lockutils [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.587 2 DEBUG oslo_concurrency.lockutils [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.587 2 DEBUG oslo_concurrency.lockutils [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.587 2 DEBUG nova.compute.manager [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:52:52 np0005470441 nova_compute[192626]: 2025-10-04 05:52:52.588 2 WARNING nova.compute.manager [req-dcc5b984-e231-476b-810b-fab6f4c08d18 req-fa65c846-9458-4601-82fc-c0024ddb5d25 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received unexpected event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:52:53 np0005470441 podman[234091]: 2025-10-04 05:52:53.338842731 +0000 UTC m=+0.078575581 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, version=9.6, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=)
Oct  4 01:52:53 np0005470441 nova_compute[192626]: 2025-10-04 05:52:53.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:54 np0005470441 nova_compute[192626]: 2025-10-04 05:52:54.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:55 np0005470441 nova_compute[192626]: 2025-10-04 05:52:55.507 2 DEBUG nova.compute.manager [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:52:55 np0005470441 nova_compute[192626]: 2025-10-04 05:52:55.507 2 DEBUG nova.compute.manager [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing instance network info cache due to event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:52:55 np0005470441 nova_compute[192626]: 2025-10-04 05:52:55.507 2 DEBUG oslo_concurrency.lockutils [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:52:55 np0005470441 nova_compute[192626]: 2025-10-04 05:52:55.508 2 DEBUG oslo_concurrency.lockutils [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:52:55 np0005470441 nova_compute[192626]: 2025-10-04 05:52:55.508 2 DEBUG nova.network.neutron [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing network info cache for port a5671074-2db1-4974-bdb0-f92d8745da99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:52:57 np0005470441 nova_compute[192626]: 2025-10-04 05:52:57.630 2 DEBUG nova.network.neutron [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updated VIF entry in instance network info cache for port a5671074-2db1-4974-bdb0-f92d8745da99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:52:57 np0005470441 nova_compute[192626]: 2025-10-04 05:52:57.632 2 DEBUG nova.network.neutron [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:52:57 np0005470441 nova_compute[192626]: 2025-10-04 05:52:57.653 2 DEBUG oslo_concurrency.lockutils [req-774c2b2d-cf17-46f1-8238-13114ae09df8 req-1d49b12c-c91f-47a9-acbf-58faded2130a 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:52:58 np0005470441 nova_compute[192626]: 2025-10-04 05:52:58.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:52:59 np0005470441 podman[234112]: 2025-10-04 05:52:59.32270507 +0000 UTC m=+0.069495510 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:52:59 np0005470441 nova_compute[192626]: 2025-10-04 05:52:59.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:01 np0005470441 podman[234137]: 2025-10-04 05:53:01.325162169 +0000 UTC m=+0.066833144 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:53:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:03Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:26:fa:3f 10.100.0.7
Oct  4 01:53:03 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:03Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:26:fa:3f 10.100.0.7
Oct  4 01:53:03 np0005470441 nova_compute[192626]: 2025-10-04 05:53:03.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:04 np0005470441 podman[234169]: 2025-10-04 05:53:04.339255946 +0000 UTC m=+0.089409813 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  4 01:53:04 np0005470441 nova_compute[192626]: 2025-10-04 05:53:04.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:06.762 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:06.763 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:06.764 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:08 np0005470441 nova_compute[192626]: 2025-10-04 05:53:08.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:08 np0005470441 nova_compute[192626]: 2025-10-04 05:53:08.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:09 np0005470441 nova_compute[192626]: 2025-10-04 05:53:09.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:09 np0005470441 nova_compute[192626]: 2025-10-04 05:53:09.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:13 np0005470441 podman[234196]: 2025-10-04 05:53:13.30686441 +0000 UTC m=+0.047121616 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:53:13 np0005470441 podman[234195]: 2025-10-04 05:53:13.311221605 +0000 UTC m=+0.054725725 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 01:53:13 np0005470441 nova_compute[192626]: 2025-10-04 05:53:13.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.468 2 DEBUG nova.compute.manager [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.469 2 DEBUG nova.compute.manager [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing instance network info cache due to event network-changed-a5671074-2db1-4974-bdb0-f92d8745da99. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.469 2 DEBUG oslo_concurrency.lockutils [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.469 2 DEBUG oslo_concurrency.lockutils [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.469 2 DEBUG nova.network.neutron [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Refreshing network info cache for port a5671074-2db1-4974-bdb0-f92d8745da99 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.551 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.552 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.552 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.552 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.552 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.553 2 INFO nova.compute.manager [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Terminating instance#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.554 2 DEBUG nova.compute.manager [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:53:14 np0005470441 kernel: tapa5671074-2d (unregistering): left promiscuous mode
Oct  4 01:53:14 np0005470441 NetworkManager[51690]: <info>  [1759557194.5772] device (tapa5671074-2d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00361|binding|INFO|Releasing lport a5671074-2db1-4974-bdb0-f92d8745da99 from this chassis (sb_readonly=0)
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00362|binding|INFO|Setting lport a5671074-2db1-4974-bdb0-f92d8745da99 down in Southbound
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00363|binding|INFO|Removing iface tapa5671074-2d ovn-installed in OVS
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.593 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:fa:3f 10.100.0.7'], port_security=['fa:16:3e:26:fa:3f 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1b4da3b1-cafb-41e8-8eae-ac240b41a891', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-199b9470-5ecb-408d-8026-8037800b7e96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '630fd102-b50c-4f61-a7b2-ff563ea84794', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=352f76fb-425b-4d46-85bc-fbca63f72e0a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a5671074-2db1-4974-bdb0-f92d8745da99) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.595 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a5671074-2db1-4974-bdb0-f92d8745da99 in datapath 199b9470-5ecb-408d-8026-8037800b7e96 unbound from our chassis#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.596 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 199b9470-5ecb-408d-8026-8037800b7e96, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.597 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bba4fbe6-a472-4277-b063-97e40426631f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.598 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96 namespace which is not needed anymore#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 kernel: tap81781200-f6 (unregistering): left promiscuous mode
Oct  4 01:53:14 np0005470441 NetworkManager[51690]: <info>  [1759557194.6662] device (tap81781200-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00364|binding|INFO|Releasing lport 81781200-f6a4-4764-bbeb-e0994759b68a from this chassis (sb_readonly=0)
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00365|binding|INFO|Setting lport 81781200-f6a4-4764-bbeb-e0994759b68a down in Southbound
Oct  4 01:53:14 np0005470441 ovn_controller[94840]: 2025-10-04T05:53:14Z|00366|binding|INFO|Removing iface tap81781200-f6 ovn-installed in OVS
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.686 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:ad:1e 2001:db8:0:1:f816:3eff:fe38:ad1e 2001:db8::f816:3eff:fe38:ad1e'], port_security=['fa:16:3e:38:ad:1e 2001:db8:0:1:f816:3eff:fe38:ad1e 2001:db8::f816:3eff:fe38:ad1e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe38:ad1e/64 2001:db8::f816:3eff:fe38:ad1e/64', 'neutron:device_id': '1b4da3b1-cafb-41e8-8eae-ac240b41a891', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '630fd102-b50c-4f61-a7b2-ff563ea84794', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c944d58f-ea5b-4aae-8c43-03ee9fbeff86, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=81781200-f6a4-4764-bbeb-e0994759b68a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:53:14 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [NOTICE]   (234008) : haproxy version is 2.8.14-c23fe91
Oct  4 01:53:14 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [NOTICE]   (234008) : path to executable is /usr/sbin/haproxy
Oct  4 01:53:14 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [WARNING]  (234008) : Exiting Master process...
Oct  4 01:53:14 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [ALERT]    (234008) : Current worker (234010) exited with code 143 (Terminated)
Oct  4 01:53:14 np0005470441 neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96[234004]: [WARNING]  (234008) : All workers exited. Exiting... (0)
Oct  4 01:53:14 np0005470441 systemd[1]: libpod-c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77.scope: Deactivated successfully.
Oct  4 01:53:14 np0005470441 podman[234265]: 2025-10-04 05:53:14.731004523 +0000 UTC m=+0.042747010 container died c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:53:14 np0005470441 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000031.scope: Deactivated successfully.
Oct  4 01:53:14 np0005470441 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000031.scope: Consumed 13.982s CPU time.
Oct  4 01:53:14 np0005470441 systemd-machined[152624]: Machine qemu-27-instance-00000031 terminated.
Oct  4 01:53:14 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77-userdata-shm.mount: Deactivated successfully.
Oct  4 01:53:14 np0005470441 systemd[1]: var-lib-containers-storage-overlay-05c29a4f3835dd1ee7fc8588c8e3c1bd96c39f46e1830a58b222ebf172c670ca-merged.mount: Deactivated successfully.
Oct  4 01:53:14 np0005470441 podman[234265]: 2025-10-04 05:53:14.767184964 +0000 UTC m=+0.078927451 container cleanup c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:53:14 np0005470441 systemd[1]: libpod-conmon-c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77.scope: Deactivated successfully.
Oct  4 01:53:14 np0005470441 NetworkManager[51690]: <info>  [1759557194.7856] manager: (tap81781200-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Oct  4 01:53:14 np0005470441 podman[234302]: 2025-10-04 05:53:14.828386634 +0000 UTC m=+0.039543688 container remove c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.829 2 INFO nova.virt.libvirt.driver [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Instance destroyed successfully.#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.830 2 DEBUG nova.objects.instance [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 1b4da3b1-cafb-41e8-8eae-ac240b41a891 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.833 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7feb2ad7-0758-4eb3-b935-40c7825231a5]: (4, ('Sat Oct  4 05:53:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96 (c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77)\nc7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77\nSat Oct  4 05:53:14 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96 (c7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77)\nc7a7cd6f3204cac81610408447dc803ddfad7756cb4ff0e4e08b959bb6f78f77\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.835 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4aa261-3439-4728-a210-da6d273113f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.836 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap199b9470-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 kernel: tap199b9470-50: left promiscuous mode
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.845 2 DEBUG nova.virt.libvirt.vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:52:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:52:51Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.846 2 DEBUG nova.network.os_vif_util [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.220", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.846 2 DEBUG nova.network.os_vif_util [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.847 2 DEBUG os_vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5671074-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.857 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe5f8aa-bb80-4666-98ff-3f0be3f2a53e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.862 2 INFO os_vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:26:fa:3f,bridge_name='br-int',has_traffic_filtering=True,id=a5671074-2db1-4974-bdb0-f92d8745da99,network=Network(199b9470-5ecb-408d-8026-8037800b7e96),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5671074-2d')#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.862 2 DEBUG nova.virt.libvirt.vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:52:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1379993306',display_name='tempest-TestGettingAddress-server-1379993306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1379993306',id=49,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFTWzVp+scbD8iO/EAR4ii6lHpq8wkJIOGeyaeDGeWZ3nhAnZnCPnEzd5XscaYV5xeRNVx9GnNUiLXi7an+E96rzIAk6FQs1pn5oevAy5JYTBgyKN8p+TR1bqN3xQ1yOxQ==',key_name='tempest-TestGettingAddress-717657981',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:52:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0tx4870u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:52:51Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=1b4da3b1-cafb-41e8-8eae-ac240b41a891,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.863 2 DEBUG nova.network.os_vif_util [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.863 2 DEBUG nova.network.os_vif_util [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.864 2 DEBUG os_vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81781200-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.868 2 INFO os_vif [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:ad:1e,bridge_name='br-int',has_traffic_filtering=True,id=81781200-f6a4-4764-bbeb-e0994759b68a,network=Network(f186cb58-c427-4b55-b47a-91b4fed4e8b2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81781200-f6')#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.869 2 INFO nova.virt.libvirt.driver [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Deleting instance files /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891_del#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.870 2 INFO nova.virt.libvirt.driver [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Deletion of /var/lib/nova/instances/1b4da3b1-cafb-41e8-8eae-ac240b41a891_del complete#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.879 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a3cefe65-7023-4dec-b3b5-b397320c2a64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.880 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fdccc175-9ce5-4947-b1ac-4a9417f20b96]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.898 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6b57a9e9-66d2-4173-925d-6018b229fbaa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506789, 'reachable_time': 42921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234344, 'error': None, 'target': 'ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.901 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-199b9470-5ecb-408d-8026-8037800b7e96 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.901 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[76d30f9a-fea5-45b8-912b-1c689cbd1d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 systemd[1]: run-netns-ovnmeta\x2d199b9470\x2d5ecb\x2d408d\x2d8026\x2d8037800b7e96.mount: Deactivated successfully.
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.902 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 81781200-f6a4-4764-bbeb-e0994759b68a in datapath f186cb58-c427-4b55-b47a-91b4fed4e8b2 unbound from our chassis#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.903 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f186cb58-c427-4b55-b47a-91b4fed4e8b2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.904 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[27a60f4c-045a-436b-9c48-eab181d0597c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:14 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:14.904 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2 namespace which is not needed anymore#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.935 2 INFO nova.compute.manager [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.936 2 DEBUG oslo.service.loopingcall [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.936 2 DEBUG nova.compute.manager [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:53:14 np0005470441 nova_compute[192626]: 2025-10-04 05:53:14.936 2 DEBUG nova.network.neutron [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [NOTICE]   (234080) : haproxy version is 2.8.14-c23fe91
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [NOTICE]   (234080) : path to executable is /usr/sbin/haproxy
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [WARNING]  (234080) : Exiting Master process...
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [WARNING]  (234080) : Exiting Master process...
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [ALERT]    (234080) : Current worker (234082) exited with code 143 (Terminated)
Oct  4 01:53:15 np0005470441 neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2[234076]: [WARNING]  (234080) : All workers exited. Exiting... (0)
Oct  4 01:53:15 np0005470441 systemd[1]: libpod-6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5.scope: Deactivated successfully.
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.025 2 DEBUG nova.compute.manager [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-unplugged-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.025 2 DEBUG oslo_concurrency.lockutils [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.026 2 DEBUG oslo_concurrency.lockutils [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.026 2 DEBUG oslo_concurrency.lockutils [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.027 2 DEBUG nova.compute.manager [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-unplugged-81781200-f6a4-4764-bbeb-e0994759b68a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:53:15 np0005470441 podman[234360]: 2025-10-04 05:53:15.027553193 +0000 UTC m=+0.043438110 container died 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.027 2 DEBUG nova.compute.manager [req-9bad655d-d86d-40bf-a0b8-f2efba4c7179 req-0aa38034-d1c2-4d73-9e5a-6c4a2cfe9a3b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-unplugged-81781200-f6a4-4764-bbeb-e0994759b68a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:53:15 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5-userdata-shm.mount: Deactivated successfully.
Oct  4 01:53:15 np0005470441 systemd[1]: var-lib-containers-storage-overlay-e83bbbf9da96c061bf99abcc7309449bea4ce565ea896870cc8608d4f739c87d-merged.mount: Deactivated successfully.
Oct  4 01:53:15 np0005470441 podman[234360]: 2025-10-04 05:53:15.05698543 +0000 UTC m=+0.072870347 container cleanup 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  4 01:53:15 np0005470441 systemd[1]: libpod-conmon-6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5.scope: Deactivated successfully.
Oct  4 01:53:15 np0005470441 podman[234389]: 2025-10-04 05:53:15.109724167 +0000 UTC m=+0.036132771 container remove 6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.114 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6e276b-1abb-4158-a117-b8e8e20898c5]: (4, ('Sat Oct  4 05:53:14 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2 (6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5)\n6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5\nSat Oct  4 05:53:15 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2 (6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5)\n6950f3d175b4d0fde1866120922614d761c47593168f3f89e90a93cb8336cea5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.116 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[02e22389-6ff4-45c4-bb74-24b825d27d13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.116 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf186cb58-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:15 np0005470441 kernel: tapf186cb58-c0: left promiscuous mode
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.132 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[65add55a-371e-439f-9ae4-8b54571d5218]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.179 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[13b19eb7-43ca-4db4-bcdd-cdbc1443ba10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.181 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2383bd95-f44e-4df9-a466-0cb3fe1484d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.198 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ada20b-0c8f-4682-af88-526fd6a9b2e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 506887, 'reachable_time': 28351, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234406, 'error': None, 'target': 'ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.200 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f186cb58-c427-4b55-b47a-91b4fed4e8b2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:53:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:15.200 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[1382dd68-fe10-4d33-beb9-762c499fb56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:53:15 np0005470441 systemd[1]: run-netns-ovnmeta\x2df186cb58\x2dc427\x2d4b55\x2db47a\x2d91b4fed4e8b2.mount: Deactivated successfully.
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.975 2 DEBUG nova.network.neutron [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updated VIF entry in instance network info cache for port a5671074-2db1-4974-bdb0-f92d8745da99. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.976 2 DEBUG nova.network.neutron [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81781200-f6a4-4764-bbeb-e0994759b68a", "address": "fa:16:3e:38:ad:1e", "network": {"id": "f186cb58-c427-4b55-b47a-91b4fed4e8b2", "bridge": "br-int", "label": "tempest-network-smoke--140182709", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe38:ad1e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81781200-f6", "ovs_interfaceid": "81781200-f6a4-4764-bbeb-e0994759b68a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:53:15 np0005470441 nova_compute[192626]: 2025-10-04 05:53:15.998 2 DEBUG oslo_concurrency.lockutils [req-380534ae-2a41-4e3c-8c01-f67e7c8043b8 req-9555e645-f179-4d5b-9375-377a917347c1 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-1b4da3b1-cafb-41e8-8eae-ac240b41a891" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.546 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-unplugged-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.547 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.547 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.547 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.548 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-unplugged-a5671074-2db1-4974-bdb0-f92d8745da99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.548 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-unplugged-a5671074-2db1-4974-bdb0-f92d8745da99 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.548 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.549 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.549 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.549 2 DEBUG oslo_concurrency.lockutils [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.549 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.550 2 WARNING nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received unexpected event network-vif-plugged-a5671074-2db1-4974-bdb0-f92d8745da99 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.550 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-deleted-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.550 2 INFO nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Neutron deleted interface 81781200-f6a4-4764-bbeb-e0994759b68a; detaching it from the instance and deleting it from the info cache#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.550 2 DEBUG nova.network.neutron [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [{"id": "a5671074-2db1-4974-bdb0-f92d8745da99", "address": "fa:16:3e:26:fa:3f", "network": {"id": "199b9470-5ecb-408d-8026-8037800b7e96", "bridge": "br-int", "label": "tempest-network-smoke--1093913608", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5671074-2d", "ovs_interfaceid": "a5671074-2db1-4974-bdb0-f92d8745da99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.577 2 DEBUG nova.compute.manager [req-0d5327cd-6013-408b-b382-51dbde94d7ab req-0327763f-0471-4ba3-8878-925aa714133f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Detach interface failed, port_id=81781200-f6a4-4764-bbeb-e0994759b68a, reason: Instance 1b4da3b1-cafb-41e8-8eae-ac240b41a891 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.753 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.754 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.754 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.845 2 DEBUG nova.network.neutron [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:53:16 np0005470441 nova_compute[192626]: 2025-10-04 05:53:16.921 2 INFO nova.compute.manager [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Took 1.98 seconds to deallocate network for instance.#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.063 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.065 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.124 2 DEBUG nova.compute.manager [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.125 2 DEBUG oslo_concurrency.lockutils [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.126 2 DEBUG oslo_concurrency.lockutils [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.126 2 DEBUG oslo_concurrency.lockutils [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.127 2 DEBUG nova.compute.manager [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] No waiting events found dispatching network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.127 2 WARNING nova.compute.manager [req-450abb5b-c870-4294-94f1-a1a8a49ef9d3 req-b3962d43-6730-4cd9-bca3-b458f357a3cd 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received unexpected event network-vif-plugged-81781200-f6a4-4764-bbeb-e0994759b68a for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.135 2 DEBUG nova.compute.provider_tree [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.181 2 DEBUG nova.scheduler.client.report [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.229 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.264 2 INFO nova.scheduler.client.report [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 1b4da3b1-cafb-41e8-8eae-ac240b41a891#033[00m
Oct  4 01:53:17 np0005470441 nova_compute[192626]: 2025-10-04 05:53:17.367 2 DEBUG oslo_concurrency.lockutils [None req-a9748fef-b032-42fb-9dbc-68cea8481f2b 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "1b4da3b1-cafb-41e8-8eae-ac240b41a891" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.816s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:18 np0005470441 podman[234408]: 2025-10-04 05:53:18.315725793 +0000 UTC m=+0.067431900 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Oct  4 01:53:18 np0005470441 podman[234407]: 2025-10-04 05:53:18.33020334 +0000 UTC m=+0.075746230 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.654 2 DEBUG nova.compute.manager [req-52d0df51-1bbf-4293-a0ff-2c746b5f3596 req-f74a5bdc-3222-496a-ba77-3620129a90de 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Received event network-vif-deleted-a5671074-2db1-4974-bdb0-f92d8745da99 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.752 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.752 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.951 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.953 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5723MB free_disk=73.42042541503906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.953 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:53:18 np0005470441 nova_compute[192626]: 2025-10-04 05:53:18.953 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.014 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.015 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.038 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.063 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.158 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.158 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:53:19 np0005470441 nova_compute[192626]: 2025-10-04 05:53:19.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:21 np0005470441 nova_compute[192626]: 2025-10-04 05:53:21.159 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:21 np0005470441 nova_compute[192626]: 2025-10-04 05:53:21.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:23 np0005470441 nova_compute[192626]: 2025-10-04 05:53:23.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:23 np0005470441 nova_compute[192626]: 2025-10-04 05:53:23.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:24 np0005470441 podman[234450]: 2025-10-04 05:53:24.305631876 +0000 UTC m=+0.056904608 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41)
Oct  4 01:53:24 np0005470441 nova_compute[192626]: 2025-10-04 05:53:24.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:53:24 np0005470441 nova_compute[192626]: 2025-10-04 05:53:24.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:28 np0005470441 nova_compute[192626]: 2025-10-04 05:53:28.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:29 np0005470441 nova_compute[192626]: 2025-10-04 05:53:29.827 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557194.8266125, 1b4da3b1-cafb-41e8-8eae-ac240b41a891 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:53:29 np0005470441 nova_compute[192626]: 2025-10-04 05:53:29.828 2 INFO nova.compute.manager [-] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:53:29 np0005470441 nova_compute[192626]: 2025-10-04 05:53:29.908 2 DEBUG nova.compute.manager [None req-d036d0b2-f902-410f-9b07-bcbbfda6b49d - - - - - -] [instance: 1b4da3b1-cafb-41e8-8eae-ac240b41a891] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:53:29 np0005470441 nova_compute[192626]: 2025-10-04 05:53:29.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:30 np0005470441 podman[234471]: 2025-10-04 05:53:30.310788176 +0000 UTC m=+0.056294820 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:53:32 np0005470441 podman[234495]: 2025-10-04 05:53:32.316648683 +0000 UTC m=+0.059059980 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 01:53:33 np0005470441 nova_compute[192626]: 2025-10-04 05:53:33.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:34 np0005470441 nova_compute[192626]: 2025-10-04 05:53:34.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:35 np0005470441 podman[234514]: 2025-10-04 05:53:35.327420875 +0000 UTC m=+0.082454863 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:53:35 np0005470441 nova_compute[192626]: 2025-10-04 05:53:35.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:35 np0005470441 nova_compute[192626]: 2025-10-04 05:53:35.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:38 np0005470441 nova_compute[192626]: 2025-10-04 05:53:38.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:39 np0005470441 nova_compute[192626]: 2025-10-04 05:53:39.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:43 np0005470441 nova_compute[192626]: 2025-10-04 05:53:43.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:43.089 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:53:43 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:43.091 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:53:43 np0005470441 nova_compute[192626]: 2025-10-04 05:53:43.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:44 np0005470441 podman[234544]: 2025-10-04 05:53:44.300262459 +0000 UTC m=+0.051354818 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:53:44 np0005470441 podman[234543]: 2025-10-04 05:53:44.300234028 +0000 UTC m=+0.052948484 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=multipathd)
Oct  4 01:53:44 np0005470441 nova_compute[192626]: 2025-10-04 05:53:44.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:48 np0005470441 nova_compute[192626]: 2025-10-04 05:53:48.717 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:49 np0005470441 podman[234588]: 2025-10-04 05:53:49.315063144 +0000 UTC m=+0.062448618 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:53:49 np0005470441 podman[234587]: 2025-10-04 05:53:49.31528192 +0000 UTC m=+0.063120157 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:53:49 np0005470441 nova_compute[192626]: 2025-10-04 05:53:49.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:53:50.093 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:53:53 np0005470441 nova_compute[192626]: 2025-10-04 05:53:53.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:54 np0005470441 nova_compute[192626]: 2025-10-04 05:53:54.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:53:55 np0005470441 podman[234629]: 2025-10-04 05:53:55.322589523 +0000 UTC m=+0.080316351 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, name=ubi9-minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Oct  4 01:53:58 np0005470441 nova_compute[192626]: 2025-10-04 05:53:58.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:00 np0005470441 nova_compute[192626]: 2025-10-04 05:54:00.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:01 np0005470441 podman[234651]: 2025-10-04 05:54:01.313410822 +0000 UTC m=+0.059116391 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.712 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:54:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:54:03 np0005470441 podman[234673]: 2025-10-04 05:54:03.305482392 +0000 UTC m=+0.058941146 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  4 01:54:03 np0005470441 nova_compute[192626]: 2025-10-04 05:54:03.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:05 np0005470441 nova_compute[192626]: 2025-10-04 05:54:05.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:06 np0005470441 podman[234694]: 2025-10-04 05:54:06.355177883 +0000 UTC m=+0.100399559 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:54:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:06.763 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:06.764 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:06.764 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.712 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.713 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.742 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.826 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.827 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.836 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.837 2 INFO nova.compute.claims [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.944 2 DEBUG nova.compute.provider_tree [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:54:08 np0005470441 nova_compute[192626]: 2025-10-04 05:54:08.976 2 DEBUG nova.scheduler.client.report [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.011 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.013 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.067 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.068 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.093 2 INFO nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.123 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.250 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.251 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.252 2 INFO nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Creating image(s)#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.252 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.252 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.253 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.267 2 DEBUG nova.policy [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.269 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.354 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.355 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.356 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.367 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.433 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.434 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.467 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.468 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.469 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.545 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.547 2 DEBUG nova.virt.disk.api [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.547 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.602 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.603 2 DEBUG nova.virt.disk.api [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.603 2 DEBUG nova.objects.instance [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 40b23805-043e-4739-93e5-5c4ad06de4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.830 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.831 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Ensure instance console log exists: /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.831 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.832 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:54:09 np0005470441 nova_compute[192626]: 2025-10-04 05:54:09.832 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:54:10 np0005470441 nova_compute[192626]: 2025-10-04 05:54:10.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:54:10 np0005470441 nova_compute[192626]: 2025-10-04 05:54:10.528 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Successfully created port: a9f4a938-7289-4e23-8b45-14e8de123fe9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:54:11 np0005470441 nova_compute[192626]: 2025-10-04 05:54:11.726 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Successfully created port: 37fcecd9-fd14-45e1-9158-6cafb17d69ed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.377 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Successfully updated port: a9f4a938-7289-4e23-8b45-14e8de123fe9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.511 2 DEBUG nova.compute.manager [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.511 2 DEBUG nova.compute.manager [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing instance network info cache due to event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.512 2 DEBUG oslo_concurrency.lockutils [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.512 2 DEBUG oslo_concurrency.lockutils [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.512 2 DEBUG nova.network.neutron [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing network info cache for port a9f4a938-7289-4e23-8b45-14e8de123fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.705 2 DEBUG nova.network.neutron [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  4 01:54:13 np0005470441 nova_compute[192626]: 2025-10-04 05:54:13.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.065 2 DEBUG nova.network.neutron [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.080 2 DEBUG oslo_concurrency.lockutils [req-bf423469-9278-4bea-8d81-1d1f2d0aef35 req-a5645918-0bb3-423a-9f3e-320b64ae120d 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.337 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Successfully updated port: 37fcecd9-fd14-45e1-9158-6cafb17d69ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.357 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.357 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.357 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  4 01:54:14 np0005470441 nova_compute[192626]: 2025-10-04 05:54:14.586 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  4 01:54:15 np0005470441 nova_compute[192626]: 2025-10-04 05:54:15.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:54:15 np0005470441 podman[234737]: 2025-10-04 05:54:15.303421959 +0000 UTC m=+0.060599204 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:54:15 np0005470441 podman[234738]: 2025-10-04 05:54:15.307409514 +0000 UTC m=+0.050859704 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:54:15 np0005470441 nova_compute[192626]: 2025-10-04 05:54:15.604 2 DEBUG nova.compute.manager [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-changed-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  4 01:54:15 np0005470441 nova_compute[192626]: 2025-10-04 05:54:15.605 2 DEBUG nova.compute.manager [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing instance network info cache due to event network-changed-37fcecd9-fd14-45e1-9158-6cafb17d69ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  4 01:54:15 np0005470441 nova_compute[192626]: 2025-10-04 05:54:15.605 2 DEBUG oslo_concurrency.lockutils [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.533 2 DEBUG nova.network.neutron [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.552 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.553 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance network_info: |[{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.553 2 DEBUG oslo_concurrency.lockutils [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.554 2 DEBUG nova.network.neutron [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing network info cache for port 37fcecd9-fd14-45e1-9158-6cafb17d69ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.558 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Start _get_guest_xml network_info=[{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.562 2 WARNING nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.567 2 DEBUG nova.virt.libvirt.host [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.567 2 DEBUG nova.virt.libvirt.host [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.576 2 DEBUG nova.virt.libvirt.host [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.577 2 DEBUG nova.virt.libvirt.host [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.578 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.579 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.579 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.580 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.580 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.580 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.581 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.581 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.581 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.582 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.582 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.582 2 DEBUG nova.virt.hardware [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.587 2 DEBUG nova.virt.libvirt.vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:54:09Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.587 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.588 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.589 2 DEBUG nova.virt.libvirt.vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:54:09Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.590 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.590 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
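Editor's note: the "Converting VIF {...}" / "Converted object VIFOpenVSwitch(...)" pairs above show `nova_to_osvif_vif` mapping a Nova VIF dict onto an os-vif object. The sketch below is a hypothetical, much-simplified mirror of that mapping, covering only the fields visible in the log pair; the real converter in `nova/network/os_vif_util.py` handles many more VIF types and attributes.

```python
from dataclasses import dataclass

# Hypothetical, simplified stand-in for os_vif's VIFOpenVSwitch object,
# holding just the attributes printed in the "Converted object" log line.
@dataclass
class VIFOpenVSwitch:
    id: str
    address: str
    bridge_name: str
    has_traffic_filtering: bool
    vif_name: str
    active: bool

def nova_to_osvif_vif(vif: dict) -> VIFOpenVSwitch:
    # port_filter in the bound driver details becomes has_traffic_filtering;
    # devname becomes vif_name; bridge_name falls back to the network bridge.
    details = vif.get("details", {})
    return VIFOpenVSwitch(
        id=vif["id"],
        address=vif["address"],
        bridge_name=details.get("bridge_name", vif["network"]["bridge"]),
        has_traffic_filtering=bool(details.get("port_filter", False)),
        vif_name=vif["devname"],
        active=vif["active"],
    )

# Trimmed copy of the first VIF dict from the log above:
vif = {
    "id": "a9f4a938-7289-4e23-8b45-14e8de123fe9",
    "address": "fa:16:3e:b1:14:da",
    "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int"},
    "details": {"port_filter": True, "bridge_name": "br-int"},
    "devname": "tapa9f4a938-72",
    "active": False,
}
osvif = nova_to_osvif_vif(vif)
```

Running this on the trimmed dict reproduces the attribute values logged in the `VIFOpenVSwitch(...)` repr (`active=False`, `has_traffic_filtering=True`, `vif_name='tapa9f4a938-72'`).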
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.591 2 DEBUG nova.objects.instance [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 40b23805-043e-4739-93e5-5c4ad06de4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.606 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <uuid>40b23805-043e-4739-93e5-5c4ad06de4a9</uuid>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <name>instance-00000032</name>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-650600854</nova:name>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:54:16</nova:creationTime>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:port uuid="a9f4a938-7289-4e23-8b45-14e8de123fe9">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        <nova:port uuid="37fcecd9-fd14-45e1-9158-6cafb17d69ed">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe51:412e" ipVersion="6"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="serial">40b23805-043e-4739-93e5-5c4ad06de4a9</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="uuid">40b23805-043e-4739-93e5-5c4ad06de4a9</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.config"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:b1:14:da"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <target dev="tapa9f4a938-72"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:51:41:2e"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <target dev="tap37fcecd9-fd"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/console.log" append="off"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:54:16 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:54:16 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:54:16 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:54:16 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
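Editor's note: the `<domain>` document dumped by `_get_guest_xml` above can be inspected with stdlib XML tooling; the sketch below parses a trimmed copy of that exact XML to recover the values Nova also records elsewhere in the log (memory in KiB, vCPU count, the fixed IPs under the `nova:` metadata namespace). The XML literal is abbreviated from the log, not generated code.

```python
import xml.etree.ElementTree as ET

# Trimmed copy of the <domain> XML logged above; element names, attributes,
# and the nova metadata namespace URI match the log verbatim.
DOMAIN_XML = """\
<domain type="kvm">
  <uuid>40b23805-043e-4739-93e5-5c4ad06de4a9</uuid>
  <name>instance-00000032</name>
  <memory>131072</memory>
  <vcpu>1</vcpu>
  <metadata>
    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
      <nova:ports>
        <nova:port uuid="a9f4a938-7289-4e23-8b45-14e8de123fe9">
          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
        </nova:port>
        <nova:port uuid="37fcecd9-fd14-45e1-9158-6cafb17d69ed">
          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe51:412e" ipVersion="6"/>
        </nova:port>
      </nova:ports>
    </nova:instance>
  </metadata>
</domain>
"""

NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

root = ET.fromstring(DOMAIN_XML)
memory_kib = int(root.findtext("memory"))  # libvirt <memory> is KiB: 131072 KiB = 128 MiB,
                                           # matching memory_mb=128 in the Instance dump
vcpus = int(root.findtext("vcpu"))
ips = [ip.get("address") for ip in root.findall(".//nova:ip", NOVA_NS)]
```

This confirms the guest was sized from the `m1.nano` flavor in the metadata block (128 MiB, 1 vCPU) and carries one IPv4 and one IPv6 fixed address, one per plugged port.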
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.607 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Preparing to wait for external event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.608 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.608 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.608 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.609 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Preparing to wait for external event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.609 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.609 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.609 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.610 2 DEBUG nova.virt.libvirt.vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:54:09Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.610 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.611 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.611 2 DEBUG os_vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.612 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.613 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.617 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9f4a938-72, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.617 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa9f4a938-72, col_values=(('external_ids', {'iface-id': 'a9f4a938-7289-4e23-8b45-14e8de123fe9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:14:da', 'vm-uuid': '40b23805-043e-4739-93e5-5c4ad06de4a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 NetworkManager[51690]: <info>  [1759557256.6199] manager: (tapa9f4a938-72): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/179)
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.627 2 INFO os_vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72')#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.628 2 DEBUG nova.virt.libvirt.vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:54:09Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.629 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.630 2 DEBUG nova.network.os_vif_util [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.630 2 DEBUG os_vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.631 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.631 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.634 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap37fcecd9-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.634 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap37fcecd9-fd, col_values=(('external_ids', {'iface-id': '37fcecd9-fd14-45e1-9158-6cafb17d69ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:41:2e', 'vm-uuid': '40b23805-043e-4739-93e5-5c4ad06de4a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 NetworkManager[51690]: <info>  [1759557256.6366] manager: (tap37fcecd9-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.643 2 INFO os_vif [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd')#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.731 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.732 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.733 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:b1:14:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.733 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:51:41:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:54:16 np0005470441 nova_compute[192626]: 2025-10-04 05:54:16.734 2 INFO nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Using config drive#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.058 2 INFO nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Creating config drive at /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.config#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.064 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo0a5i_ga execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.190 2 DEBUG oslo_concurrency.processutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo0a5i_ga" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.2637] manager: (tapa9f4a938-72): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Oct  4 01:54:17 np0005470441 kernel: tapa9f4a938-72: entered promiscuous mode
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00367|binding|INFO|Claiming lport a9f4a938-7289-4e23-8b45-14e8de123fe9 for this chassis.
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00368|binding|INFO|a9f4a938-7289-4e23-8b45-14e8de123fe9: Claiming fa:16:3e:b1:14:da 10.100.0.10
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.2863] manager: (tap37fcecd9-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/182)
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.292 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:14:da 10.100.0.10'], port_security=['fa:16:3e:b1:14:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '40b23805-043e-4739-93e5-5c4ad06de4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1301a60e-039b-4195-9e51-48a134fdaa07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abf05fc2-0c1c-4292-b3db-76f51430cc4e, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a9f4a938-7289-4e23-8b45-14e8de123fe9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.293 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a9f4a938-7289-4e23-8b45-14e8de123fe9 in datapath c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 bound to our chassis#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.294 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c23d2be2-c453-4bb0-bd2f-7e68f2b62db4#033[00m
Oct  4 01:54:17 np0005470441 systemd-udevd[234807]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:54:17 np0005470441 systemd-udevd[234805]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.308 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[625dae74-8759-4a54-9caf-de0184e06514]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.309 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc23d2be2-c1 in ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.312 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc23d2be2-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.313 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[46a3ba67-6d8a-4c19-9793-3c1a912dd94e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.314 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5224b224-f34b-400e-99c3-031f06bc88b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.3183] device (tapa9f4a938-72): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.3193] device (tapa9f4a938-72): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.329 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[630935d6-7b14-43ba-b4b6-a0c644b2b426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 systemd-machined[152624]: New machine qemu-28-instance-00000032.
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.358 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a96fabcf-54b4-4e1f-a136-ad4df88eecfb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 kernel: tap37fcecd9-fd: entered promiscuous mode
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.3681] device (tap37fcecd9-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.3703] device (tap37fcecd9-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00369|binding|INFO|Setting lport a9f4a938-7289-4e23-8b45-14e8de123fe9 ovn-installed in OVS
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00370|binding|INFO|Setting lport a9f4a938-7289-4e23-8b45-14e8de123fe9 up in Southbound
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.391 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[0e6824d6-6248-4e40-b55c-2fa96bc98226]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00371|binding|INFO|Claiming lport 37fcecd9-fd14-45e1-9158-6cafb17d69ed for this chassis.
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00372|binding|INFO|37fcecd9-fd14-45e1-9158-6cafb17d69ed: Claiming fa:16:3e:51:41:2e 2001:db8::f816:3eff:fe51:412e
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.406 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[0a0b8935-2782-4579-9175-2e6a53ad99af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.4089] manager: (tapc23d2be2-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/183)
Oct  4 01:54:17 np0005470441 systemd[1]: Started Virtual Machine qemu-28-instance-00000032.
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00373|binding|INFO|Setting lport 37fcecd9-fd14-45e1-9158-6cafb17d69ed ovn-installed in OVS
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00374|binding|INFO|Setting lport 37fcecd9-fd14-45e1-9158-6cafb17d69ed up in Southbound
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.422 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:41:2e 2001:db8::f816:3eff:fe51:412e'], port_security=['fa:16:3e:51:41:2e 2001:db8::f816:3eff:fe51:412e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe51:412e/64', 'neutron:device_id': '40b23805-043e-4739-93e5-5c4ad06de4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1301a60e-039b-4195-9e51-48a134fdaa07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=251ac611-cc92-44c8-b957-057bc1d81924, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=37fcecd9-fd14-45e1-9158-6cafb17d69ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.444 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdd2116-e19d-4dad-a817-b9d5c12662d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.447 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[120278fe-25dc-485d-b0ee-54f279c140ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.4729] device (tapc23d2be2-c0): carrier: link connected
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.479 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[2bf6af6e-c452-42e1-9ba6-17634a87503a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.497 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb6b1aa-1e18-4047-bc23-7c5bd58fc808]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23d2be2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:57:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515512, 'reachable_time': 23731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234842, 'error': None, 'target': 'ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.516 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9bc9fd16-02fc-4be7-bb7a-00a48a7fba27]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:57a6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515512, 'tstamp': 515512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234843, 'error': None, 'target': 'ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.543 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[10b090d2-3316-4369-bb52-96a6e0234f7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc23d2be2-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:57:a6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 104], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515512, 'reachable_time': 23731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234845, 'error': None, 'target': 'ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.579 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3b612033-907b-438d-bda9-ced5890a3b82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.659 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[02393df2-cd3f-4577-91f8-96a709e53b8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.661 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23d2be2-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.662 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.663 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc23d2be2-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:17 np0005470441 NetworkManager[51690]: <info>  [1759557257.6662] manager: (tapc23d2be2-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 kernel: tapc23d2be2-c0: entered promiscuous mode
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.671 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc23d2be2-c0, col_values=(('external_ids', {'iface-id': '94849053-a5a5-46ee-8745-b7cafd5a75a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:17Z|00375|binding|INFO|Releasing lport 94849053-a5a5-46ee-8745-b7cafd5a75a0 from this chassis (sb_readonly=0)
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.700 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c23d2be2-c453-4bb0-bd2f-7e68f2b62db4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c23d2be2-c453-4bb0-bd2f-7e68f2b62db4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.701 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[45e2e2af-4d8f-486e-bd00-831fc06abd3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.701 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/c23d2be2-c453-4bb0-bd2f-7e68f2b62db4.pid.haproxy
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID c23d2be2-c453-4bb0-bd2f-7e68f2b62db4
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:54:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:17.702 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'env', 'PROCESS_TAG=haproxy-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c23d2be2-c453-4bb0-bd2f-7e68f2b62db4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.799 2 DEBUG nova.compute.manager [req-9a3c81d3-05e5-49b8-b763-b825dcc2fafb req-aeeb0d68-2d20-4082-afde-e614663f9913 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.799 2 DEBUG oslo_concurrency.lockutils [req-9a3c81d3-05e5-49b8-b763-b825dcc2fafb req-aeeb0d68-2d20-4082-afde-e614663f9913 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.800 2 DEBUG oslo_concurrency.lockutils [req-9a3c81d3-05e5-49b8-b763-b825dcc2fafb req-aeeb0d68-2d20-4082-afde-e614663f9913 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.800 2 DEBUG oslo_concurrency.lockutils [req-9a3c81d3-05e5-49b8-b763-b825dcc2fafb req-aeeb0d68-2d20-4082-afde-e614663f9913 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.800 2 DEBUG nova.compute.manager [req-9a3c81d3-05e5-49b8-b763-b825dcc2fafb req-aeeb0d68-2d20-4082-afde-e614663f9913 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Processing event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.901 2 DEBUG nova.compute.manager [req-47798313-503d-4f79-ba6f-ec964b021039 req-0e196f3f-d8ed-494f-835f-9a0e1fb60078 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.901 2 DEBUG oslo_concurrency.lockutils [req-47798313-503d-4f79-ba6f-ec964b021039 req-0e196f3f-d8ed-494f-835f-9a0e1fb60078 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.902 2 DEBUG oslo_concurrency.lockutils [req-47798313-503d-4f79-ba6f-ec964b021039 req-0e196f3f-d8ed-494f-835f-9a0e1fb60078 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.902 2 DEBUG oslo_concurrency.lockutils [req-47798313-503d-4f79-ba6f-ec964b021039 req-0e196f3f-d8ed-494f-835f-9a0e1fb60078 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:17 np0005470441 nova_compute[192626]: 2025-10-04 05:54:17.903 2 DEBUG nova.compute.manager [req-47798313-503d-4f79-ba6f-ec964b021039 req-0e196f3f-d8ed-494f-835f-9a0e1fb60078 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Processing event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.110 2 DEBUG nova.network.neutron [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updated VIF entry in instance network info cache for port 37fcecd9-fd14-45e1-9158-6cafb17d69ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.111 2 DEBUG nova.network.neutron [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.133 2 DEBUG oslo_concurrency.lockutils [req-f5c741d6-fb37-4918-b027-4258b991f752 req-644e88ef-3119-4ba1-ac07-31dfa8a68112 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:54:18 np0005470441 podman[234885]: 2025-10-04 05:54:18.081716664 +0000 UTC m=+0.021528070 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:54:18 np0005470441 podman[234885]: 2025-10-04 05:54:18.230634457 +0000 UTC m=+0.170445863 container create 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  4 01:54:18 np0005470441 systemd[1]: Started libpod-conmon-90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f.scope.
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.292 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557258.2923317, 40b23805-043e-4739-93e5-5c4ad06de4a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.293 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] VM Started (Lifecycle Event)#033[00m
Oct  4 01:54:18 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.297 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.301 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:54:18 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/34db02ff90e711257fad657c32a678df40adf6456a22911907362e74c8d6980f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.304 2 INFO nova.virt.libvirt.driver [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance spawned successfully.#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.305 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.315 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.318 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.344 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.344 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.345 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.346 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.346 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.347 2 DEBUG nova.virt.libvirt.driver [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.353 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.354 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557258.2924464, 40b23805-043e-4739-93e5-5c4ad06de4a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.354 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:54:18 np0005470441 podman[234885]: 2025-10-04 05:54:18.365458885 +0000 UTC m=+0.305270261 container init 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  4 01:54:18 np0005470441 podman[234885]: 2025-10-04 05:54:18.37604763 +0000 UTC m=+0.315859006 container start 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.393 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:54:18 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [NOTICE]   (234905) : New worker (234907) forked
Oct  4 01:54:18 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [NOTICE]   (234905) : Loading success.
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.398 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557258.2995372, 40b23805-043e-4739-93e5-5c4ad06de4a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.398 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.423 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.427 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.435 2 INFO nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Took 9.18 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.436 2 DEBUG nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.445 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 37fcecd9-fd14-45e1-9158-6cafb17d69ed in datapath b003f74b-e8ab-45f4-bc8e-1821d74186e4 unbound from our chassis#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.447 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b003f74b-e8ab-45f4-bc8e-1821d74186e4#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.459 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[85903b9c-93f9-43a7-9828-d4a252dd9d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.460 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb003f74b-e1 in ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.461 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb003f74b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.461 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[88f8b513-9caa-4efd-bdbb-6f6446ba644e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.462 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5b62d5cf-59e2-49d0-aeaf-d55a44dea01c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.474 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.477 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[1a28cd8b-2438-4331-93aa-55b2783f0b40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.502 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b4acc352-4437-4548-98c2-0f8fd352e5e6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.527 2 INFO nova.compute.manager [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Took 9.73 seconds to build instance.#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.531 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[d90b39ba-fe33-4836-af84-0c85ba366fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 NetworkManager[51690]: <info>  [1759557258.5376] manager: (tapb003f74b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/185)
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.539 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[586317a5-5dfd-4b70-82b1-86725dcd13b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.552 2 DEBUG oslo_concurrency.lockutils [None req-f867240d-461d-4851-a73c-6fdfc639946e 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.839s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.586 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a46caca7-0c07-432f-bb2b-f1e1c1035da8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.590 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e168c9-60ed-46bd-9264-67eb3a65d70f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 NetworkManager[51690]: <info>  [1759557258.6213] device (tapb003f74b-e0): carrier: link connected
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.629 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9037e6ab-87ff-4c3b-9a34-073778fab419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.653 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5839fb27-af16-4062-9b49-f02be31da21a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb003f74b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:fc:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515627, 'reachable_time': 41404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234926, 'error': None, 'target': 'ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.674 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[2c1baaeb-bc5e-4479-876c-d069a61e5432]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb2:fcc2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 515627, 'tstamp': 515627}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234927, 'error': None, 'target': 'ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.703 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcc7dbe-2432-419b-ba96-cec35bb96b72]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb003f74b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b2:fc:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 105], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515627, 'reachable_time': 41404, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234928, 'error': None, 'target': 'ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.719 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.742 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bceaff6b-d6c5-4be4-96b2-4e0b91d2c27a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.775 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d673061b-5aba-4637-8962-6a92337efa34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.776 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb003f74b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.777 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.778 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb003f74b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:18 np0005470441 kernel: tapb003f74b-e0: entered promiscuous mode
Oct  4 01:54:18 np0005470441 NetworkManager[51690]: <info>  [1759557258.7817] manager: (tapb003f74b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/186)
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.788 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb003f74b-e0, col_values=(('external_ids', {'iface-id': 'f67d5b0a-0486-4198-ad57-1bac4b594baa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:18Z|00376|binding|INFO|Releasing lport f67d5b0a-0486-4198-ad57-1bac4b594baa from this chassis (sb_readonly=0)
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.813 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b003f74b-e8ab-45f4-bc8e-1821d74186e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b003f74b-e8ab-45f4-bc8e-1821d74186e4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.814 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a329a3fe-ef03-444f-97d1-3664d7d3c25d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.815 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-b003f74b-e8ab-45f4-bc8e-1821d74186e4
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/b003f74b-e8ab-45f4-bc8e-1821d74186e4.pid.haproxy
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID b003f74b-e8ab-45f4-bc8e-1821d74186e4
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:54:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:18.816 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'env', 'PROCESS_TAG=haproxy-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b003f74b-e8ab-45f4-bc8e-1821d74186e4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.957 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.957 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.957 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:54:18 np0005470441 nova_compute[192626]: 2025-10-04 05:54:18.957 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 40b23805-043e-4739-93e5-5c4ad06de4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:54:19 np0005470441 podman[234959]: 2025-10-04 05:54:19.222148168 +0000 UTC m=+0.027821632 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:54:19 np0005470441 podman[234959]: 2025-10-04 05:54:19.775986077 +0000 UTC m=+0.581659511 container create 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:54:19 np0005470441 systemd[1]: Started libpod-conmon-6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0.scope.
Oct  4 01:54:19 np0005470441 podman[234973]: 2025-10-04 05:54:19.876228921 +0000 UTC m=+0.066013870 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.892 2 DEBUG nova.compute.manager [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.892 2 DEBUG oslo_concurrency.lockutils [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.892 2 DEBUG oslo_concurrency.lockutils [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.893 2 DEBUG oslo_concurrency.lockutils [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.893 2 DEBUG nova.compute.manager [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:54:19 np0005470441 nova_compute[192626]: 2025-10-04 05:54:19.893 2 WARNING nova.compute.manager [req-18711d9c-1462-49f3-b37a-327584264bc3 req-0bdfb7e3-4605-4f49-adfd-ecdc4391f480 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received unexpected event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 for instance with vm_state active and task_state None.#033[00m
Oct  4 01:54:19 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:54:19 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13965b553bcd0c1abaa423b91c510419284efa17e8da866ba475fb1a1f39bbe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:54:19 np0005470441 podman[234972]: 2025-10-04 05:54:19.93253779 +0000 UTC m=+0.122448343 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid)
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.002 2 DEBUG nova.compute.manager [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.003 2 DEBUG oslo_concurrency.lockutils [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.003 2 DEBUG oslo_concurrency.lockutils [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.003 2 DEBUG oslo_concurrency.lockutils [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.004 2 DEBUG nova.compute.manager [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:54:20 np0005470441 nova_compute[192626]: 2025-10-04 05:54:20.004 2 WARNING nova.compute.manager [req-3b34ca3e-17e5-4c40-9109-232a4dfb3af0 req-63dc8ad1-55e4-499b-a129-aa3be20761fe 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received unexpected event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed for instance with vm_state active and task_state None.#033[00m
Oct  4 01:54:20 np0005470441 podman[234959]: 2025-10-04 05:54:20.061323065 +0000 UTC m=+0.866996519 container init 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  4 01:54:20 np0005470441 podman[234959]: 2025-10-04 05:54:20.06879182 +0000 UTC m=+0.874465254 container start 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:54:20 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [NOTICE]   (235019) : New worker (235021) forked
Oct  4 01:54:20 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [NOTICE]   (235019) : Loading success.
Oct  4 01:54:21 np0005470441 nova_compute[192626]: 2025-10-04 05:54:21.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.818 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.853 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", 
"details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.876 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.877 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.878 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.879 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.879 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.905 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.906 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.907 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.907 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:54:23 np0005470441 nova_compute[192626]: 2025-10-04 05:54:23.975 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.065 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.066 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.131 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.323 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.325 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5570MB free_disk=73.41958618164062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.326 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.326 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.409 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 40b23805-043e-4739-93e5-5c4ad06de4a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.410 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.410 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.455 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.690 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.721 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:54:24 np0005470441 nova_compute[192626]: 2025-10-04 05:54:24.722 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.396s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:54:25 np0005470441 NetworkManager[51690]: <info>  [1759557265.4057] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Oct  4 01:54:25 np0005470441 NetworkManager[51690]: <info>  [1759557265.4074] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Oct  4 01:54:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:25Z|00377|binding|INFO|Releasing lport f67d5b0a-0486-4198-ad57-1bac4b594baa from this chassis (sb_readonly=0)
Oct  4 01:54:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:25Z|00378|binding|INFO|Releasing lport 94849053-a5a5-46ee-8745-b7cafd5a75a0 from this chassis (sb_readonly=0)
Oct  4 01:54:25 np0005470441 nova_compute[192626]: 2025-10-04 05:54:25.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:25Z|00379|binding|INFO|Releasing lport f67d5b0a-0486-4198-ad57-1bac4b594baa from this chassis (sb_readonly=0)
Oct  4 01:54:25 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:25Z|00380|binding|INFO|Releasing lport 94849053-a5a5-46ee-8745-b7cafd5a75a0 from this chassis (sb_readonly=0)
Oct  4 01:54:25 np0005470441 nova_compute[192626]: 2025-10-04 05:54:25.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:25 np0005470441 nova_compute[192626]: 2025-10-04 05:54:25.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:25 np0005470441 nova_compute[192626]: 2025-10-04 05:54:25.560 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:26 np0005470441 podman[235038]: 2025-10-04 05:54:26.320198945 +0000 UTC m=+0.066609957 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.581 2 DEBUG nova.compute.manager [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.582 2 DEBUG nova.compute.manager [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing instance network info cache due to event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.583 2 DEBUG oslo_concurrency.lockutils [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.583 2 DEBUG oslo_concurrency.lockutils [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.583 2 DEBUG nova.network.neutron [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing network info cache for port a9f4a938-7289-4e23-8b45-14e8de123fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:26 np0005470441 nova_compute[192626]: 2025-10-04 05:54:26.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:54:28 np0005470441 nova_compute[192626]: 2025-10-04 05:54:28.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:30 np0005470441 nova_compute[192626]: 2025-10-04 05:54:30.205 2 DEBUG nova.network.neutron [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updated VIF entry in instance network info cache for port a9f4a938-7289-4e23-8b45-14e8de123fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:54:30 np0005470441 nova_compute[192626]: 2025-10-04 05:54:30.207 2 DEBUG nova.network.neutron [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:54:30 np0005470441 nova_compute[192626]: 2025-10-04 05:54:30.232 2 DEBUG oslo_concurrency.lockutils [req-de45710d-f371-4406-8bd4-42695d4a7c42 req-761e0465-fc64-43cc-9b4f-f28aa403cb1b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:54:31 np0005470441 nova_compute[192626]: 2025-10-04 05:54:31.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:32Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:14:da 10.100.0.10
Oct  4 01:54:32 np0005470441 ovn_controller[94840]: 2025-10-04T05:54:32Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:14:da 10.100.0.10
Oct  4 01:54:32 np0005470441 podman[235078]: 2025-10-04 05:54:32.29246398 +0000 UTC m=+0.047064834 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:54:33 np0005470441 nova_compute[192626]: 2025-10-04 05:54:33.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:34 np0005470441 podman[235103]: 2025-10-04 05:54:34.302407904 +0000 UTC m=+0.052914673 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  4 01:54:36 np0005470441 nova_compute[192626]: 2025-10-04 05:54:36.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:37 np0005470441 podman[235122]: 2025-10-04 05:54:37.438472779 +0000 UTC m=+0.189492561 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  4 01:54:38 np0005470441 nova_compute[192626]: 2025-10-04 05:54:38.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:41 np0005470441 nova_compute[192626]: 2025-10-04 05:54:41.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:43 np0005470441 nova_compute[192626]: 2025-10-04 05:54:43.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:46 np0005470441 podman[235150]: 2025-10-04 05:54:46.337626346 +0000 UTC m=+0.073867966 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  4 01:54:46 np0005470441 podman[235149]: 2025-10-04 05:54:46.342994811 +0000 UTC m=+0.086369516 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 01:54:46 np0005470441 nova_compute[192626]: 2025-10-04 05:54:46.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:48 np0005470441 nova_compute[192626]: 2025-10-04 05:54:48.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:50 np0005470441 nova_compute[192626]: 2025-10-04 05:54:50.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:50.179 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:54:50 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:50.180 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:54:50 np0005470441 podman[235196]: 2025-10-04 05:54:50.305310332 +0000 UTC m=+0.062411186 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:54:50 np0005470441 podman[235195]: 2025-10-04 05:54:50.308389231 +0000 UTC m=+0.067735730 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  4 01:54:51 np0005470441 nova_compute[192626]: 2025-10-04 05:54:51.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:53 np0005470441 nova_compute[192626]: 2025-10-04 05:54:53.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:56 np0005470441 nova_compute[192626]: 2025-10-04 05:54:56.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:54:57 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:54:57.183 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:54:57 np0005470441 podman[235231]: 2025-10-04 05:54:57.351681574 +0000 UTC m=+0.092951705 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350)
Oct  4 01:54:58 np0005470441 nova_compute[192626]: 2025-10-04 05:54:58.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:01 np0005470441 nova_compute[192626]: 2025-10-04 05:55:01.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:03 np0005470441 podman[235253]: 2025-10-04 05:55:03.292112834 +0000 UTC m=+0.049851155 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:55:03 np0005470441 nova_compute[192626]: 2025-10-04 05:55:03.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:05 np0005470441 podman[235277]: 2025-10-04 05:55:05.29929486 +0000 UTC m=+0.052686657 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:55:06 np0005470441 nova_compute[192626]: 2025-10-04 05:55:06.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:06.765 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:06.766 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:08 np0005470441 podman[235296]: 2025-10-04 05:55:08.334464524 +0000 UTC m=+0.088492387 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_controller, tcib_managed=true)
Oct  4 01:55:08 np0005470441 nova_compute[192626]: 2025-10-04 05:55:08.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:10 np0005470441 nova_compute[192626]: 2025-10-04 05:55:10.713 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:10 np0005470441 nova_compute[192626]: 2025-10-04 05:55:10.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:11 np0005470441 nova_compute[192626]: 2025-10-04 05:55:11.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:12 np0005470441 nova_compute[192626]: 2025-10-04 05:55:12.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:12 np0005470441 nova_compute[192626]: 2025-10-04 05:55:12.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 01:55:13 np0005470441 nova_compute[192626]: 2025-10-04 05:55:13.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:14 np0005470441 nova_compute[192626]: 2025-10-04 05:55:14.556 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:15 np0005470441 nova_compute[192626]: 2025-10-04 05:55:15.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:15 np0005470441 nova_compute[192626]: 2025-10-04 05:55:15.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 01:55:15 np0005470441 nova_compute[192626]: 2025-10-04 05:55:15.734 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 01:55:16 np0005470441 nova_compute[192626]: 2025-10-04 05:55:16.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:16 np0005470441 nova_compute[192626]: 2025-10-04 05:55:16.735 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:16 np0005470441 nova_compute[192626]: 2025-10-04 05:55:16.735 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:55:17 np0005470441 podman[235324]: 2025-10-04 05:55:17.351394769 +0000 UTC m=+0.082798942 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:55:17 np0005470441 podman[235323]: 2025-10-04 05:55:17.360989675 +0000 UTC m=+0.096139596 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:55:18 np0005470441 nova_compute[192626]: 2025-10-04 05:55:18.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:19 np0005470441 nova_compute[192626]: 2025-10-04 05:55:19.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:19 np0005470441 nova_compute[192626]: 2025-10-04 05:55:19.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:55:19 np0005470441 nova_compute[192626]: 2025-10-04 05:55:19.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:55:20 np0005470441 nova_compute[192626]: 2025-10-04 05:55:20.218 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:55:20 np0005470441 nova_compute[192626]: 2025-10-04 05:55:20.218 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:55:20 np0005470441 nova_compute[192626]: 2025-10-04 05:55:20.218 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:55:20 np0005470441 nova_compute[192626]: 2025-10-04 05:55:20.219 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 40b23805-043e-4739-93e5-5c4ad06de4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:55:21 np0005470441 podman[235380]: 2025-10-04 05:55:21.322434892 +0000 UTC m=+0.067536804 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:55:21 np0005470441 podman[235379]: 2025-10-04 05:55:21.342499019 +0000 UTC m=+0.085260873 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 01:55:21 np0005470441 nova_compute[192626]: 2025-10-04 05:55:21.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.602 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.625 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.626 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.627 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.628 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.649 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.649 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.650 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.650 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.706 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.766 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.767 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.824 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.982 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.983 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5581MB free_disk=73.39175033569336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.983 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:22 np0005470441 nova_compute[192626]: 2025-10-04 05:55:22.984 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.095 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 40b23805-043e-4739-93e5-5c4ad06de4a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.096 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.096 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.234 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.258 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.259 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.260 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.348 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:23 np0005470441 nova_compute[192626]: 2025-10-04 05:55:23.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:26 np0005470441 nova_compute[192626]: 2025-10-04 05:55:26.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:26 np0005470441 nova_compute[192626]: 2025-10-04 05:55:26.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:28 np0005470441 podman[235426]: 2025-10-04 05:55:28.297330258 +0000 UTC m=+0.053028767 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:55:28 np0005470441 nova_compute[192626]: 2025-10-04 05:55:28.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:28 np0005470441 nova_compute[192626]: 2025-10-04 05:55:28.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:31 np0005470441 nova_compute[192626]: 2025-10-04 05:55:31.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:31.595 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:55:31 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:31.597 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:55:31 np0005470441 nova_compute[192626]: 2025-10-04 05:55:31.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:33 np0005470441 nova_compute[192626]: 2025-10-04 05:55:33.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:34 np0005470441 podman[235449]: 2025-10-04 05:55:34.308297037 +0000 UTC m=+0.053579752 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:55:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:35.599 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.016 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.016 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.016 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.017 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.017 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.018 2 INFO nova.compute.manager [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Terminating instance#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.019 2 DEBUG nova.compute.manager [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:55:36 np0005470441 kernel: tapa9f4a938-72 (unregistering): left promiscuous mode
Oct  4 01:55:36 np0005470441 NetworkManager[51690]: <info>  [1759557336.0529] device (tapa9f4a938-72): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00381|binding|INFO|Releasing lport a9f4a938-7289-4e23-8b45-14e8de123fe9 from this chassis (sb_readonly=0)
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00382|binding|INFO|Setting lport a9f4a938-7289-4e23-8b45-14e8de123fe9 down in Southbound
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00383|binding|INFO|Removing iface tapa9f4a938-72 ovn-installed in OVS
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 kernel: tap37fcecd9-fd (unregistering): left promiscuous mode
Oct  4 01:55:36 np0005470441 NetworkManager[51690]: <info>  [1759557336.0948] device (tap37fcecd9-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.095 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:14:da 10.100.0.10'], port_security=['fa:16:3e:b1:14:da 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '40b23805-043e-4739-93e5-5c4ad06de4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1301a60e-039b-4195-9e51-48a134fdaa07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abf05fc2-0c1c-4292-b3db-76f51430cc4e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a9f4a938-7289-4e23-8b45-14e8de123fe9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.097 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a9f4a938-7289-4e23-8b45-14e8de123fe9 in datapath c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 unbound from our chassis#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.098 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.100 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d038e1fb-e7cb-4624-abaa-a0ca0e37f056]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.100 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 namespace which is not needed anymore#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00384|binding|INFO|Releasing lport 37fcecd9-fd14-45e1-9158-6cafb17d69ed from this chassis (sb_readonly=0)
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00385|binding|INFO|Setting lport 37fcecd9-fd14-45e1-9158-6cafb17d69ed down in Southbound
Oct  4 01:55:36 np0005470441 ovn_controller[94840]: 2025-10-04T05:55:36Z|00386|binding|INFO|Removing iface tap37fcecd9-fd ovn-installed in OVS
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.117 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:41:2e 2001:db8::f816:3eff:fe51:412e'], port_security=['fa:16:3e:51:41:2e 2001:db8::f816:3eff:fe51:412e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe51:412e/64', 'neutron:device_id': '40b23805-043e-4739-93e5-5c4ad06de4a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1301a60e-039b-4195-9e51-48a134fdaa07', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=251ac611-cc92-44c8-b957-057bc1d81924, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=37fcecd9-fd14-45e1-9158-6cafb17d69ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 podman[235473]: 2025-10-04 05:55:36.140278103 +0000 UTC m=+0.061853740 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:55:36 np0005470441 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000032.scope: Deactivated successfully.
Oct  4 01:55:36 np0005470441 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000032.scope: Consumed 18.063s CPU time.
Oct  4 01:55:36 np0005470441 systemd-machined[152624]: Machine qemu-28-instance-00000032 terminated.
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [NOTICE]   (234905) : haproxy version is 2.8.14-c23fe91
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [NOTICE]   (234905) : path to executable is /usr/sbin/haproxy
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [WARNING]  (234905) : Exiting Master process...
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [WARNING]  (234905) : Exiting Master process...
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [ALERT]    (234905) : Current worker (234907) exited with code 143 (Terminated)
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4[234901]: [WARNING]  (234905) : All workers exited. Exiting... (0)
Oct  4 01:55:36 np0005470441 systemd[1]: libpod-90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f.scope: Deactivated successfully.
Oct  4 01:55:36 np0005470441 conmon[234901]: conmon 90a2a3204f2ad4719514 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f.scope/container/memory.events
Oct  4 01:55:36 np0005470441 podman[235519]: 2025-10-04 05:55:36.233249667 +0000 UTC m=+0.043528223 container died 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:55:36 np0005470441 NetworkManager[51690]: <info>  [1759557336.2515] manager: (tap37fcecd9-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Oct  4 01:55:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f-userdata-shm.mount: Deactivated successfully.
Oct  4 01:55:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay-34db02ff90e711257fad657c32a678df40adf6456a22911907362e74c8d6980f-merged.mount: Deactivated successfully.
Oct  4 01:55:36 np0005470441 podman[235519]: 2025-10-04 05:55:36.281855555 +0000 UTC m=+0.092134111 container cleanup 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:55:36 np0005470441 systemd[1]: libpod-conmon-90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f.scope: Deactivated successfully.
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.300 2 INFO nova.virt.libvirt.driver [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance destroyed successfully.#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.301 2 DEBUG nova.objects.instance [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 40b23805-043e-4739-93e5-5c4ad06de4a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.323 2 DEBUG nova.virt.libvirt.vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:54:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:54:18Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.324 2 DEBUG nova.network.os_vif_util [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.325 2 DEBUG nova.network.os_vif_util [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.326 2 DEBUG os_vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9f4a938-72, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.339 2 INFO os_vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:14:da,bridge_name='br-int',has_traffic_filtering=True,id=a9f4a938-7289-4e23-8b45-14e8de123fe9,network=Network(c23d2be2-c453-4bb0-bd2f-7e68f2b62db4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa9f4a938-72')#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.340 2 DEBUG nova.virt.libvirt.vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:54:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-650600854',display_name='tempest-TestGettingAddress-server-650600854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-650600854',id=50,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBujPvJHislz+5SzrEeq6/FdWqvF27U93knVnZePuAPZbALR7y51XFzlnmqcf+Kj+V02Lo5+hXLWlZrBE5bZHlBSuEtwiQL4a52+jJz/LdzSzyF+JCdiDN3JjvlcS2cD3g==',key_name='tempest-TestGettingAddress-1602568233',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:54:18Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-g69olte0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:54:18Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=40b23805-043e-4739-93e5-5c4ad06de4a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.340 2 DEBUG nova.network.os_vif_util [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.341 2 DEBUG nova.network.os_vif_util [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.342 2 DEBUG os_vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap37fcecd9-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 podman[235571]: 2025-10-04 05:55:36.348601395 +0000 UTC m=+0.042062760 container remove 90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3)
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.348 2 INFO os_vif [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:41:2e,bridge_name='br-int',has_traffic_filtering=True,id=37fcecd9-fd14-45e1-9158-6cafb17d69ed,network=Network(b003f74b-e8ab-45f4-bc8e-1821d74186e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap37fcecd9-fd')#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.350 2 INFO nova.virt.libvirt.driver [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Deleting instance files /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9_del#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.351 2 INFO nova.virt.libvirt.driver [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Deletion of /var/lib/nova/instances/40b23805-043e-4739-93e5-5c4ad06de4a9_del complete#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.355 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c849447d-4d31-44d6-8d4d-3188edcd6050]: (4, ('Sat Oct  4 05:55:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 (90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f)\n90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f\nSat Oct  4 05:55:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 (90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f)\n90a2a3204f2ad47195147fea79f28bd7c364f22e373410860e9c6bdbec64912f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.357 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6fca0416-df3d-4d82-a13b-0266b531185c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.357 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc23d2be2-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 kernel: tapc23d2be2-c0: left promiscuous mode
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.374 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f98228a1-85a8-4a4a-b53b-fe83f10d26b7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.409 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3c10f680-ad7b-4972-bef0-bcade35a7b5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.410 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[18d3984c-3ad7-4f8e-9e4a-4ef2d48ff041]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.414 2 DEBUG nova.compute.manager [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.415 2 DEBUG nova.compute.manager [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing instance network info cache due to event network-changed-a9f4a938-7289-4e23-8b45-14e8de123fe9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.415 2 DEBUG oslo_concurrency.lockutils [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.415 2 DEBUG oslo_concurrency.lockutils [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.416 2 DEBUG nova.network.neutron [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Refreshing network info cache for port a9f4a938-7289-4e23-8b45-14e8de123fe9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.422 2 INFO nova.compute.manager [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.422 2 DEBUG oslo.service.loopingcall [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.423 2 DEBUG nova.compute.manager [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.423 2 DEBUG nova.network.neutron [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.427 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6279ad48-f89c-4265-9246-422c59b2e9ff]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515504, 'reachable_time': 38535, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235586, 'error': None, 'target': 'ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 systemd[1]: run-netns-ovnmeta\x2dc23d2be2\x2dc453\x2d4bb0\x2dbd2f\x2d7e68f2b62db4.mount: Deactivated successfully.
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.431 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c23d2be2-c453-4bb0-bd2f-7e68f2b62db4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.431 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[8cc0ea75-ced8-48ca-9272-e469f5018d6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.431 103689 INFO neutron.agent.ovn.metadata.agent [-] Port 37fcecd9-fd14-45e1-9158-6cafb17d69ed in datapath b003f74b-e8ab-45f4-bc8e-1821d74186e4 unbound from our chassis#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.432 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b003f74b-e8ab-45f4-bc8e-1821d74186e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.433 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[55e695b2-d514-4664-bb73-c275ec562948]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.434 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4 namespace which is not needed anymore#033[00m
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [NOTICE]   (235019) : haproxy version is 2.8.14-c23fe91
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [NOTICE]   (235019) : path to executable is /usr/sbin/haproxy
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [WARNING]  (235019) : Exiting Master process...
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [ALERT]    (235019) : Current worker (235021) exited with code 143 (Terminated)
Oct  4 01:55:36 np0005470441 neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4[235005]: [WARNING]  (235019) : All workers exited. Exiting... (0)
Oct  4 01:55:36 np0005470441 systemd[1]: libpod-6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0.scope: Deactivated successfully.
Oct  4 01:55:36 np0005470441 podman[235602]: 2025-10-04 05:55:36.555110626 +0000 UTC m=+0.047267281 container died 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:55:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0-userdata-shm.mount: Deactivated successfully.
Oct  4 01:55:36 np0005470441 systemd[1]: var-lib-containers-storage-overlay-a13965b553bcd0c1abaa423b91c510419284efa17e8da866ba475fb1a1f39bbe-merged.mount: Deactivated successfully.
Oct  4 01:55:36 np0005470441 podman[235602]: 2025-10-04 05:55:36.588745223 +0000 UTC m=+0.080901878 container cleanup 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  4 01:55:36 np0005470441 systemd[1]: libpod-conmon-6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0.scope: Deactivated successfully.
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.615 2 DEBUG nova.compute.manager [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-unplugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.616 2 DEBUG oslo_concurrency.lockutils [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.616 2 DEBUG oslo_concurrency.lockutils [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.616 2 DEBUG oslo_concurrency.lockutils [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.617 2 DEBUG nova.compute.manager [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-unplugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.617 2 DEBUG nova.compute.manager [req-840c239f-2823-4cd5-ba9d-2b2099bfcb3e req-2a559bab-041d-4c2c-83e8-6da5a1f2cfe4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-unplugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:55:36 np0005470441 podman[235632]: 2025-10-04 05:55:36.651814647 +0000 UTC m=+0.042696449 container remove 6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.659 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[279023cd-9fd7-4377-a707-abc66b796ee0]: (4, ('Sat Oct  4 05:55:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4 (6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0)\n6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0\nSat Oct  4 05:55:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4 (6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0)\n6a2076041a4dd1b8d943110498d4cbbad1413e3828df43ec5236d07c29cc3de0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.660 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aacb9e0a-2149-40d8-aa8c-9c72a44ee2ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.661 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb003f74b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 kernel: tapb003f74b-e0: left promiscuous mode
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.668 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bb933b17-273b-4f81-b18b-24b98be5f19c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.703 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[5958033a-7bc3-4fd6-ab4e-f59457e15f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.705 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1c0d21-ed0a-4e30-ab19-db64523e3ee5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 nova_compute[192626]: 2025-10-04 05:55:36.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.726 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7f330201-35dd-4c8a-81a4-82dc1ade0f23]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 515618, 'reachable_time': 25760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 235647, 'error': None, 'target': 'ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.728 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b003f74b-e8ab-45f4-bc8e-1821d74186e4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:55:36 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:55:36.728 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[3e9aaf34-f2be-457f-861e-538b7aa8e247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:55:37 np0005470441 systemd[1]: run-netns-ovnmeta\x2db003f74b\x2de8ab\x2d45f4\x2dbc8e\x2d1821d74186e4.mount: Deactivated successfully.
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.637 2 DEBUG nova.network.neutron [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.685 2 INFO nova.compute.manager [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Took 1.26 seconds to deallocate network for instance.#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.747 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.748 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.806 2 DEBUG nova.compute.provider_tree [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.822 2 DEBUG nova.scheduler.client.report [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.833 2 DEBUG nova.network.neutron [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updated VIF entry in instance network info cache for port a9f4a938-7289-4e23-8b45-14e8de123fe9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.833 2 DEBUG nova.network.neutron [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Updating instance_info_cache with network_info: [{"id": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "address": "fa:16:3e:b1:14:da", "network": {"id": "c23d2be2-c453-4bb0-bd2f-7e68f2b62db4", "bridge": "br-int", "label": "tempest-network-smoke--1930342407", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa9f4a938-72", "ovs_interfaceid": "a9f4a938-7289-4e23-8b45-14e8de123fe9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "address": "fa:16:3e:51:41:2e", "network": {"id": "b003f74b-e8ab-45f4-bc8e-1821d74186e4", "bridge": "br-int", "label": "tempest-network-smoke--999881934", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe51:412e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap37fcecd9-fd", "ovs_interfaceid": "37fcecd9-fd14-45e1-9158-6cafb17d69ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.843 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.854 2 DEBUG oslo_concurrency.lockutils [req-cad8f094-4bfb-48bf-a06a-aca93408f25f req-59ce13ad-f925-4190-982d-f4ef74cb4df7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-40b23805-043e-4739-93e5-5c4ad06de4a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.869 2 INFO nova.scheduler.client.report [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 40b23805-043e-4739-93e5-5c4ad06de4a9#033[00m
Oct  4 01:55:37 np0005470441 nova_compute[192626]: 2025-10-04 05:55:37.930 2 DEBUG oslo_concurrency.lockutils [None req-9db26f4b-9851-49d4-974b-a57f732e81ff 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.515 2 DEBUG nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-unplugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.516 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.516 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.516 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.517 2 DEBUG nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-unplugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.517 2 WARNING nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received unexpected event network-vif-unplugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.517 2 DEBUG nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.517 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.517 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.518 2 DEBUG oslo_concurrency.lockutils [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.518 2 DEBUG nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.518 2 WARNING nova.compute.manager [req-1c27f69b-b223-4d1a-b012-b2faf701fbd1 req-cec5d943-b883-481b-84c8-9175756acb81 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received unexpected event network-vif-plugged-a9f4a938-7289-4e23-8b45-14e8de123fe9 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.721 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.721 2 DEBUG oslo_concurrency.lockutils [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.722 2 DEBUG oslo_concurrency.lockutils [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.722 2 DEBUG oslo_concurrency.lockutils [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "40b23805-043e-4739-93e5-5c4ad06de4a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.722 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] No waiting events found dispatching network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.722 2 WARNING nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received unexpected event network-vif-plugged-37fcecd9-fd14-45e1-9158-6cafb17d69ed for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.723 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-deleted-a9f4a938-7289-4e23-8b45-14e8de123fe9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.723 2 INFO nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Neutron deleted interface a9f4a938-7289-4e23-8b45-14e8de123fe9; detaching it from the instance and deleting it from the info cache#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.723 2 DEBUG nova.network.neutron [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.725 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Detach interface failed, port_id=a9f4a938-7289-4e23-8b45-14e8de123fe9, reason: Instance 40b23805-043e-4739-93e5-5c4ad06de4a9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.725 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Received event network-vif-deleted-37fcecd9-fd14-45e1-9158-6cafb17d69ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.726 2 INFO nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Neutron deleted interface 37fcecd9-fd14-45e1-9158-6cafb17d69ed; detaching it from the instance and deleting it from the info cache#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.726 2 DEBUG nova.network.neutron [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.728 2 DEBUG nova.compute.manager [req-1d0c3b2a-e8d0-4b3b-8958-ff65ece041e7 req-abb5b448-e156-4de9-8bba-ae2b4adefd5b 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Detach interface failed, port_id=37fcecd9-fd14-45e1-9158-6cafb17d69ed, reason: Instance 40b23805-043e-4739-93e5-5c4ad06de4a9 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  4 01:55:38 np0005470441 nova_compute[192626]: 2025-10-04 05:55:38.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:39 np0005470441 podman[235648]: 2025-10-04 05:55:39.363553826 +0000 UTC m=+0.112143597 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  4 01:55:41 np0005470441 nova_compute[192626]: 2025-10-04 05:55:41.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:42 np0005470441 nova_compute[192626]: 2025-10-04 05:55:42.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:42 np0005470441 nova_compute[192626]: 2025-10-04 05:55:42.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:43 np0005470441 nova_compute[192626]: 2025-10-04 05:55:43.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:46 np0005470441 nova_compute[192626]: 2025-10-04 05:55:46.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:48 np0005470441 podman[235677]: 2025-10-04 05:55:48.290316846 +0000 UTC m=+0.042104951 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:55:48 np0005470441 podman[235676]: 2025-10-04 05:55:48.292438508 +0000 UTC m=+0.046552080 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  4 01:55:48 np0005470441 nova_compute[192626]: 2025-10-04 05:55:48.480 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:55:48 np0005470441 nova_compute[192626]: 2025-10-04 05:55:48.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:51 np0005470441 nova_compute[192626]: 2025-10-04 05:55:51.298 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557336.2971957, 40b23805-043e-4739-93e5-5c4ad06de4a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:55:51 np0005470441 nova_compute[192626]: 2025-10-04 05:55:51.299 2 INFO nova.compute.manager [-] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:55:51 np0005470441 nova_compute[192626]: 2025-10-04 05:55:51.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:51 np0005470441 nova_compute[192626]: 2025-10-04 05:55:51.423 2 DEBUG nova.compute.manager [None req-56ae5b74-f7ed-4dac-bf43-8c6389521521 - - - - - -] [instance: 40b23805-043e-4739-93e5-5c4ad06de4a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:55:52 np0005470441 podman[235717]: 2025-10-04 05:55:52.318545974 +0000 UTC m=+0.069757477 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 01:55:52 np0005470441 podman[235718]: 2025-10-04 05:55:52.318580085 +0000 UTC m=+0.063749044 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_id=edpm)
Oct  4 01:55:53 np0005470441 nova_compute[192626]: 2025-10-04 05:55:53.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:56 np0005470441 nova_compute[192626]: 2025-10-04 05:55:56.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:58 np0005470441 nova_compute[192626]: 2025-10-04 05:55:58.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:55:59 np0005470441 podman[235758]: 2025-10-04 05:55:59.327442127 +0000 UTC m=+0.071376574 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, release=1755695350, maintainer=Red Hat, Inc.)
Oct  4 01:56:01 np0005470441 nova_compute[192626]: 2025-10-04 05:56:01.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.713 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:56:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:56:03 np0005470441 nova_compute[192626]: 2025-10-04 05:56:03.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:05 np0005470441 podman[235780]: 2025-10-04 05:56:05.289413816 +0000 UTC m=+0.047692982 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:56:06 np0005470441 podman[235801]: 2025-10-04 05:56:06.313397451 +0000 UTC m=+0.058990698 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:56:06 np0005470441 nova_compute[192626]: 2025-10-04 05:56:06.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:06.766 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:06.766 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:08 np0005470441 nova_compute[192626]: 2025-10-04 05:56:08.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:10 np0005470441 podman[235820]: 2025-10-04 05:56:10.342429301 +0000 UTC m=+0.093808819 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:56:10 np0005470441 nova_compute[192626]: 2025-10-04 05:56:10.908 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:11 np0005470441 nova_compute[192626]: 2025-10-04 05:56:11.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:11 np0005470441 nova_compute[192626]: 2025-10-04 05:56:11.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:13 np0005470441 nova_compute[192626]: 2025-10-04 05:56:13.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:16 np0005470441 nova_compute[192626]: 2025-10-04 05:56:16.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:16 np0005470441 nova_compute[192626]: 2025-10-04 05:56:16.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:16 np0005470441 nova_compute[192626]: 2025-10-04 05:56:16.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:56:18 np0005470441 nova_compute[192626]: 2025-10-04 05:56:18.969 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:19 np0005470441 podman[235847]: 2025-10-04 05:56:19.303249219 +0000 UTC m=+0.055005943 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:56:19 np0005470441 podman[235846]: 2025-10-04 05:56:19.31025193 +0000 UTC m=+0.062821138 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:56:19 np0005470441 nova_compute[192626]: 2025-10-04 05:56:19.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:20 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:20.324 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:e2:a8 10.100.0.2 2001:db8::f816:3eff:fe82:e2a8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe82:e2a8/64', 'neutron:device_id': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9de0a55a-1728-4d43-9726-17f22be2dd80, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4daefd5d-41c8-41fc-8c57-5cfdc9027f69) old=Port_Binding(mac=['fa:16:3e:82:e2:a8 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:56:20 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:20.325 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4daefd5d-41c8-41fc-8c57-5cfdc9027f69 in datapath e938f61e-30e4-4d66-b8cd-e55f04207f2a updated#033[00m
Oct  4 01:56:20 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:20.326 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e938f61e-30e4-4d66-b8cd-e55f04207f2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:56:20 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:20.327 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[334434b8-99b8-4e7b-ac3e-d8fea0e00315]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:21 np0005470441 nova_compute[192626]: 2025-10-04 05:56:21.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:21 np0005470441 nova_compute[192626]: 2025-10-04 05:56:21.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:21 np0005470441 nova_compute[192626]: 2025-10-04 05:56:21.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:56:21 np0005470441 nova_compute[192626]: 2025-10-04 05:56:21.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:56:21 np0005470441 nova_compute[192626]: 2025-10-04 05:56:21.774 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.749 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.750 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.750 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.750 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:56:22 np0005470441 podman[235890]: 2025-10-04 05:56:22.846733004 +0000 UTC m=+0.054355095 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  4 01:56:22 np0005470441 podman[235891]: 2025-10-04 05:56:22.849238136 +0000 UTC m=+0.054422597 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes 
Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.925 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.926 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5762MB free_disk=73.4200668334961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.926 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.927 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.980 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:56:22 np0005470441 nova_compute[192626]: 2025-10-04 05:56:22.980 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:56:23 np0005470441 nova_compute[192626]: 2025-10-04 05:56:23.002 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:56:23 np0005470441 nova_compute[192626]: 2025-10-04 05:56:23.019 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:56:23 np0005470441 nova_compute[192626]: 2025-10-04 05:56:23.041 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:56:23 np0005470441 nova_compute[192626]: 2025-10-04 05:56:23.041 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:23 np0005470441 nova_compute[192626]: 2025-10-04 05:56:23.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:25.071 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:e2:a8 10.100.0.2 2001:db8:0:1:f816:3eff:fe82:e2a8 2001:db8::f816:3eff:fe82:e2a8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe82:e2a8/64 2001:db8::f816:3eff:fe82:e2a8/64', 'neutron:device_id': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9de0a55a-1728-4d43-9726-17f22be2dd80, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=4daefd5d-41c8-41fc-8c57-5cfdc9027f69) old=Port_Binding(mac=['fa:16:3e:82:e2:a8 10.100.0.2 2001:db8::f816:3eff:fe82:e2a8'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe82:e2a8/64', 'neutron:device_id': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:56:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:25.073 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 4daefd5d-41c8-41fc-8c57-5cfdc9027f69 in datapath e938f61e-30e4-4d66-b8cd-e55f04207f2a updated#033[00m
Oct  4 01:56:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:25.073 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e938f61e-30e4-4d66-b8cd-e55f04207f2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:56:25 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:25.074 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1f115890-cd3c-440c-857d-ad34cea81b49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:26 np0005470441 nova_compute[192626]: 2025-10-04 05:56:26.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:28 np0005470441 nova_compute[192626]: 2025-10-04 05:56:28.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:29 np0005470441 nova_compute[192626]: 2025-10-04 05:56:29.041 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:29 np0005470441 nova_compute[192626]: 2025-10-04 05:56:29.042 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:56:30 np0005470441 podman[235927]: 2025-10-04 05:56:30.35776021 +0000 UTC m=+0.091963497 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.486 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.486 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.502 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.571 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.572 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.578 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.579 2 INFO nova.compute.claims [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.679 2 DEBUG nova.compute.provider_tree [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.694 2 DEBUG nova.scheduler.client.report [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.716 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.716 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.774 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.775 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.823 2 INFO nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.870 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.946 2 DEBUG nova.policy [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.974 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.975 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.976 2 INFO nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Creating image(s)#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.976 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.976 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.977 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:30 np0005470441 nova_compute[192626]: 2025-10-04 05:56:30.989 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.064 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.065 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.066 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.077 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.131 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.132 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.168 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.169 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.169 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.240 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.242 2 DEBUG nova.virt.disk.api [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.242 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.305 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.307 2 DEBUG nova.virt.disk.api [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.308 2 DEBUG nova.objects.instance [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid 753b29af-9c02-4eb4-a936-6c2a29a1ca6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.343 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.344 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Ensure instance console log exists: /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.345 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.345 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.346 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:31 np0005470441 nova_compute[192626]: 2025-10-04 05:56:31.843 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Successfully created port: b75ffdcd-dd04-4543-8be3-49803a84b890 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.712 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Successfully updated port: b75ffdcd-dd04-4543-8be3-49803a84b890 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.738 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.738 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.739 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.822 2 DEBUG nova.compute.manager [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.822 2 DEBUG nova.compute.manager [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing instance network info cache due to event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.823 2 DEBUG oslo_concurrency.lockutils [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:56:32 np0005470441 nova_compute[192626]: 2025-10-04 05:56:32.917 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:56:33 np0005470441 nova_compute[192626]: 2025-10-04 05:56:33.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.024 2 DEBUG nova.network.neutron [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.050 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.050 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance network_info: |[{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.051 2 DEBUG oslo_concurrency.lockutils [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.051 2 DEBUG nova.network.neutron [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing network info cache for port b75ffdcd-dd04-4543-8be3-49803a84b890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.054 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Start _get_guest_xml network_info=[{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.057 2 WARNING nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.061 2 DEBUG nova.virt.libvirt.host [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.062 2 DEBUG nova.virt.libvirt.host [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.066 2 DEBUG nova.virt.libvirt.host [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.067 2 DEBUG nova.virt.libvirt.host [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.068 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.068 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.068 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.069 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.069 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.069 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.069 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.070 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.070 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.070 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.070 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.071 2 DEBUG nova.virt.hardware [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.074 2 DEBUG nova.virt.libvirt.vif [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-588804165',display_name='tempest-TestGettingAddress-server-588804165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-588804165',id=52,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdhVl89e0BeyqDsYM0oDOcVwrB3PmrERcpLbjIle7yvpHDZc+YrNQpXy4dmNQkUqV1/F3L1PX0UF/fii5DLVhqbB4jPtiuHHjnIcR8APf+STrmV2Yc5NdQU2y1yow77WQ==',key_name='tempest-TestGettingAddress-212026097',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-i6b5qbx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:56:30Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=753b29af-9c02-4eb4-a936-6c2a29a1ca6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.074 2 DEBUG nova.network.os_vif_util [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.075 2 DEBUG nova.network.os_vif_util [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.076 2 DEBUG nova.objects.instance [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid 753b29af-9c02-4eb4-a936-6c2a29a1ca6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.089 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <uuid>753b29af-9c02-4eb4-a936-6c2a29a1ca6b</uuid>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <name>instance-00000034</name>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-588804165</nova:name>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:56:34</nova:creationTime>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        <nova:port uuid="b75ffdcd-dd04-4543-8be3-49803a84b890">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe45:2c0a" ipVersion="6"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe45:2c0a" ipVersion="6"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="serial">753b29af-9c02-4eb4-a936-6c2a29a1ca6b</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="uuid">753b29af-9c02-4eb4-a936-6c2a29a1ca6b</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.config"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:45:2c:0a"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <target dev="tapb75ffdcd-dd"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/console.log" append="off"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:56:34 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:56:34 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:56:34 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:56:34 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.090 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Preparing to wait for external event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.090 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.091 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.091 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.092 2 DEBUG nova.virt.libvirt.vif [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-588804165',display_name='tempest-TestGettingAddress-server-588804165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-588804165',id=52,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdhVl89e0BeyqDsYM0oDOcVwrB3PmrERcpLbjIle7yvpHDZc+YrNQpXy4dmNQkUqV1/F3L1PX0UF/fii5DLVhqbB4jPtiuHHjnIcR8APf+STrmV2Yc5NdQU2y1yow77WQ==',key_name='tempest-TestGettingAddress-212026097',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-i6b5qbx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:56:30Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=753b29af-9c02-4eb4-a936-6c2a29a1ca6b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.092 2 DEBUG nova.network.os_vif_util [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.093 2 DEBUG nova.network.os_vif_util [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.093 2 DEBUG os_vif [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.094 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.097 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb75ffdcd-dd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.097 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb75ffdcd-dd, col_values=(('external_ids', {'iface-id': 'b75ffdcd-dd04-4543-8be3-49803a84b890', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:2c:0a', 'vm-uuid': '753b29af-9c02-4eb4-a936-6c2a29a1ca6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.1003] manager: (tapb75ffdcd-dd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/190)
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.105 2 INFO os_vif [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd')#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.149 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.149 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.149 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:45:2c:0a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.150 2 INFO nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Using config drive#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.510 2 INFO nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Creating config drive at /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.config#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.516 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30i4d8ql execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.644 2 DEBUG oslo_concurrency.processutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30i4d8ql" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.7187] manager: (tapb75ffdcd-dd): new Tun device (/org/freedesktop/NetworkManager/Devices/191)
Oct  4 01:56:34 np0005470441 kernel: tapb75ffdcd-dd: entered promiscuous mode
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.721 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:34Z|00387|binding|INFO|Claiming lport b75ffdcd-dd04-4543-8be3-49803a84b890 for this chassis.
Oct  4 01:56:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:34Z|00388|binding|INFO|b75ffdcd-dd04-4543-8be3-49803a84b890: Claiming fa:16:3e:45:2c:0a 10.100.0.11 2001:db8:0:1:f816:3eff:fe45:2c0a 2001:db8::f816:3eff:fe45:2c0a
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.745 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:2c:0a 10.100.0.11 2001:db8:0:1:f816:3eff:fe45:2c0a 2001:db8::f816:3eff:fe45:2c0a'], port_security=['fa:16:3e:45:2c:0a 10.100.0.11 2001:db8:0:1:f816:3eff:fe45:2c0a 2001:db8::f816:3eff:fe45:2c0a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe45:2c0a/64 2001:db8::f816:3eff:fe45:2c0a/64', 'neutron:device_id': '753b29af-9c02-4eb4-a936-6c2a29a1ca6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be1f8e11-80cc-4d9b-a21f-197709e7fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9de0a55a-1728-4d43-9726-17f22be2dd80, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=b75ffdcd-dd04-4543-8be3-49803a84b890) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.746 103689 INFO neutron.agent.ovn.metadata.agent [-] Port b75ffdcd-dd04-4543-8be3-49803a84b890 in datapath e938f61e-30e4-4d66-b8cd-e55f04207f2a bound to our chassis#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.747 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e938f61e-30e4-4d66-b8cd-e55f04207f2a#033[00m
Oct  4 01:56:34 np0005470441 systemd-udevd[235981]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:56:34 np0005470441 systemd-machined[152624]: New machine qemu-29-instance-00000034.
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.756 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9c7dc5-033e-4da7-b00c-a167720efd14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.757 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape938f61e-31 in ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.759 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape938f61e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.759 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aa774fa0-e16b-4150-9b28-3e514e1005a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.759 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b64461-c43a-4e51-b099-094c2c63c488]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.7643] device (tapb75ffdcd-dd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.7654] device (tapb75ffdcd-dd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.771 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[b654e07e-09c9-4abc-b183-1a271f62edcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 systemd[1]: Started Virtual Machine qemu-29-instance-00000034.
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:34Z|00389|binding|INFO|Setting lport b75ffdcd-dd04-4543-8be3-49803a84b890 ovn-installed in OVS
Oct  4 01:56:34 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:34Z|00390|binding|INFO|Setting lport b75ffdcd-dd04-4543-8be3-49803a84b890 up in Southbound
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.794 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[be2bbd78-b261-4272-ba21-be49a8b93f2f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.825 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[8c71d2a1-187e-4e08-8dbb-b9790a2e38bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.8309] manager: (tape938f61e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.830 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[89a0c145-66da-467d-bdaf-473686fbc689]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 systemd-udevd[235986]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.860 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[9e2f996c-37db-4f1f-af6e-814208cac033]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.864 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[b13b0859-8715-429b-adad-2e415ddb46da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 NetworkManager[51690]: <info>  [1759557394.8905] device (tape938f61e-30): carrier: link connected
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.901 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[e6021988-41d4-4989-8a50-102b365a2278]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.924 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7aa2ee9f-6a8d-4f1c-9415-ed5a7619dbf0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape938f61e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:e2:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529254, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236015, 'error': None, 'target': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.951 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[79b77f49-0630-4c3d-83ed-60057ae21258]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe82:e2a8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 529254, 'tstamp': 529254}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236016, 'error': None, 'target': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:34.973 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[29e5a9e8-3594-4eab-8e12-13c75f8ef312]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape938f61e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:82:e2:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 306, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 109], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529254, 'reachable_time': 26822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 264, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 264, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236017, 'error': None, 'target': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.997 2 DEBUG nova.compute.manager [req-ffc61015-b9e6-4029-8e37-46b630c2574c req-3506d3ef-b7e9-4870-b1bd-654c104edcde 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.998 2 DEBUG oslo_concurrency.lockutils [req-ffc61015-b9e6-4029-8e37-46b630c2574c req-3506d3ef-b7e9-4870-b1bd-654c104edcde 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:34 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.999 2 DEBUG oslo_concurrency.lockutils [req-ffc61015-b9e6-4029-8e37-46b630c2574c req-3506d3ef-b7e9-4870-b1bd-654c104edcde 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:34.999 2 DEBUG oslo_concurrency.lockutils [req-ffc61015-b9e6-4029-8e37-46b630c2574c req-3506d3ef-b7e9-4870-b1bd-654c104edcde 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.000 2 DEBUG nova.compute.manager [req-ffc61015-b9e6-4029-8e37-46b630c2574c req-3506d3ef-b7e9-4870-b1bd-654c104edcde 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Processing event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.014 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f4df0b33-6e4c-40a7-8989-435f471faab7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.080 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[360980d7-ccb3-4b6c-a272-64547945aa82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.081 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape938f61e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.081 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.082 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape938f61e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:35 np0005470441 kernel: tape938f61e-30: entered promiscuous mode
Oct  4 01:56:35 np0005470441 NetworkManager[51690]: <info>  [1759557395.0844] manager: (tape938f61e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.087 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape938f61e-30, col_values=(('external_ids', {'iface-id': '4daefd5d-41c8-41fc-8c57-5cfdc9027f69'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:35Z|00391|binding|INFO|Releasing lport 4daefd5d-41c8-41fc-8c57-5cfdc9027f69 from this chassis (sb_readonly=0)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.090 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e938f61e-30e4-4d66-b8cd-e55f04207f2a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e938f61e-30e4-4d66-b8cd-e55f04207f2a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.091 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3b31df2d-ed51-4007-91c1-712930e59898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.092 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-e938f61e-30e4-4d66-b8cd-e55f04207f2a
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/e938f61e-30e4-4d66-b8cd-e55f04207f2a.pid.haproxy
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID e938f61e-30e4-4d66-b8cd-e55f04207f2a
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.093 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'env', 'PROCESS_TAG=haproxy-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e938f61e-30e4-4d66-b8cd-e55f04207f2a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.159 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:35 np0005470441 podman[236056]: 2025-10-04 05:56:35.483421605 +0000 UTC m=+0.062462578 container create 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.502 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557395.502119, 753b29af-9c02-4eb4-a936-6c2a29a1ca6b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.503 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] VM Started (Lifecycle Event)#033[00m
Oct  4 01:56:35 np0005470441 systemd[1]: Started libpod-conmon-2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666.scope.
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.505 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.510 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.514 2 INFO nova.virt.libvirt.driver [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance spawned successfully.#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.514 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:56:35 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:56:35 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9f7e142c0bd4b19cda7ecd83b45644106629b8d11b597cfb565097d20ab56060/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.530 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:56:35 np0005470441 podman[236056]: 2025-10-04 05:56:35.536688687 +0000 UTC m=+0.115729660 container init 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.539 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.543 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.543 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.543 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 podman[236056]: 2025-10-04 05:56:35.544128781 +0000 UTC m=+0.123169754 container start 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.544 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.544 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.544 2 DEBUG nova.virt.libvirt.driver [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:56:35 np0005470441 podman[236056]: 2025-10-04 05:56:35.451678152 +0000 UTC m=+0.030719165 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:56:35 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [NOTICE]   (236085) : New worker (236087) forked
Oct  4 01:56:35 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [NOTICE]   (236085) : Loading success.
Oct  4 01:56:35 np0005470441 podman[236069]: 2025-10-04 05:56:35.588439684 +0000 UTC m=+0.070152947 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.593 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.593 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557395.5023334, 753b29af-9c02-4eb4-a936-6c2a29a1ca6b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.593 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:56:35 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:35.594 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.617 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.621 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557395.5101087, 753b29af-9c02-4eb4-a936-6c2a29a1ca6b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.621 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.626 2 INFO nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Took 4.65 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.626 2 DEBUG nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.657 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.660 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.696 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.719 2 INFO nova.compute.manager [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Took 5.17 seconds to build instance.#033[00m
Oct  4 01:56:35 np0005470441 nova_compute[192626]: 2025-10-04 05:56:35.744 2 DEBUG oslo_concurrency.lockutils [None req-d771e52f-c6d0-4449-b6f6-7602fa38f90a 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
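The oslo_concurrency.lockutils lines above follow a fixed pattern: one named lock per instance UUID, with "acquired :: waited Ns" on entry and "released :: held Ns" on exit. A minimal stdlib sketch of that pattern (not oslo's actual implementation; `named_lock` and `timed_lock` are hypothetical names) looks like this:

```python
import threading
import time

# Hypothetical sketch of the per-name lock pattern seen in the
# lockutils log lines: one lock object per key (e.g. an instance UUID),
# with wait-time and hold-time measured around the critical section.
_locks = {}
_registry_guard = threading.Lock()

def named_lock(name):
    # Lazily create exactly one lock per name, like lockutils' semaphores.
    with _registry_guard:
        return _locks.setdefault(name, threading.Lock())

class timed_lock:
    """Context manager that records wait and hold times, lockutils-style."""
    def __init__(self, name):
        self.lock = named_lock(name)

    def __enter__(self):
        start = time.monotonic()
        self.lock.acquire()
        self.waited = time.monotonic() - start      # "waited 0.000s"
        self._acquired_at = time.monotonic()
        return self

    def __exit__(self, *exc):
        self.held = time.monotonic() - self._acquired_at  # "held 5.257s"
        self.lock.release()
        return False

with timed_lock("753b29af-9c02-4eb4-a936-6c2a29a1ca6b") as l:
    pass  # _locked_do_build_and_run_instance work would happen here

print(f"waited {l.waited:.3f}s, held {l.held:.3f}s")
```

The same-name reentry from another thread would block in `__enter__`, which is where the nonzero "waited" values in the log come from.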
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.151 2 DEBUG nova.compute.manager [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.151 2 DEBUG oslo_concurrency.lockutils [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.152 2 DEBUG oslo_concurrency.lockutils [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.152 2 DEBUG oslo_concurrency.lockutils [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.152 2 DEBUG nova.compute.manager [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] No waiting events found dispatching network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.152 2 WARNING nova.compute.manager [req-69466a7b-2e5b-41d1-86e8-efb8ad845495 req-47c0c481-66e6-4a83-8190-74ff24262bbc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received unexpected event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 for instance with vm_state active and task_state None.#033[00m
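The "No waiting events found" / "Received unexpected event" pair above comes from Nova's prepare-then-pop external-event protocol: a waiter must register interest in an event name before Neutron delivers it, and an event that arrives with no registered waiter is logged as unexpected. A hedged sketch of that registry (not Nova's actual `InstanceEvents` code) under those assumptions:

```python
import threading

# Sketch of the prepare/pop event-waiter pattern behind the log lines above.
# Class and method names mirror the log for readability; the implementation
# is illustrative only.
class InstanceEvents:
    def __init__(self):
        self._waiters = {}            # event name -> threading.Event
        self._lock = threading.Lock()

    def prepare(self, name):
        # Register interest BEFORE triggering the external action
        # (e.g. before asking Neutron to plug a VIF).
        with self._lock:
            ev = threading.Event()
            self._waiters[name] = ev
            return ev

    def pop(self, name):
        # Called when the event arrives; None means nobody was waiting,
        # which maps to the "Received unexpected event" warning.
        with self._lock:
            return self._waiters.pop(name, None)

events = InstanceEvents()
waiter = events.prepare("network-vif-plugged-b75ffdcd")
ev = events.pop("network-vif-plugged-b75ffdcd")
if ev is not None:
    ev.set()                                       # wake the waiter
print(events.pop("network-vif-plugged-b75ffdcd"))  # → None (already popped)
```

In the logged case the instance had already reached vm_state active with no pending task, so the pop found no waiter and the event was merely warned about, not treated as an error.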
Oct  4 01:56:37 np0005470441 podman[236107]: 2025-10-04 05:56:37.312359622 +0000 UTC m=+0.060722378 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.443 2 DEBUG nova.network.neutron [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updated VIF entry in instance network info cache for port b75ffdcd-dd04-4543-8be3-49803a84b890. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.444 2 DEBUG nova.network.neutron [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:56:37 np0005470441 nova_compute[192626]: 2025-10-04 05:56:37.500 2 DEBUG oslo_concurrency.lockutils [req-40f241be-03d5-4f98-ba93-b83c17bf3a01 req-621a1cfb-dfcc-4943-aae4-201fc38b5f32 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:56:38 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:56:38.596 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:56:38 np0005470441 nova_compute[192626]: 2025-10-04 05:56:38.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:39 np0005470441 nova_compute[192626]: 2025-10-04 05:56:39.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:41 np0005470441 podman[236127]: 2025-10-04 05:56:41.383709109 +0000 UTC m=+0.138078392 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  4 01:56:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:43Z|00392|binding|INFO|Releasing lport 4daefd5d-41c8-41fc-8c57-5cfdc9027f69 from this chassis (sb_readonly=0)
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:43 np0005470441 NetworkManager[51690]: <info>  [1759557403.1194] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Oct  4 01:56:43 np0005470441 NetworkManager[51690]: <info>  [1759557403.1208] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/195)
Oct  4 01:56:43 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:43Z|00393|binding|INFO|Releasing lport 4daefd5d-41c8-41fc-8c57-5cfdc9027f69 from this chassis (sb_readonly=0)
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.728 2 DEBUG nova.compute.manager [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.729 2 DEBUG nova.compute.manager [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing instance network info cache due to event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.729 2 DEBUG oslo_concurrency.lockutils [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.729 2 DEBUG oslo_concurrency.lockutils [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.730 2 DEBUG nova.network.neutron [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing network info cache for port b75ffdcd-dd04-4543-8be3-49803a84b890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:56:43 np0005470441 nova_compute[192626]: 2025-10-04 05:56:43.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:44 np0005470441 nova_compute[192626]: 2025-10-04 05:56:44.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:46 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:46Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:2c:0a 10.100.0.11
Oct  4 01:56:46 np0005470441 ovn_controller[94840]: 2025-10-04T05:56:46Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:2c:0a 10.100.0.11
Oct  4 01:56:48 np0005470441 nova_compute[192626]: 2025-10-04 05:56:48.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:49 np0005470441 nova_compute[192626]: 2025-10-04 05:56:49.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:49 np0005470441 nova_compute[192626]: 2025-10-04 05:56:49.435 2 DEBUG nova.network.neutron [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updated VIF entry in instance network info cache for port b75ffdcd-dd04-4543-8be3-49803a84b890. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:56:49 np0005470441 nova_compute[192626]: 2025-10-04 05:56:49.436 2 DEBUG nova.network.neutron [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:56:49 np0005470441 nova_compute[192626]: 2025-10-04 05:56:49.490 2 DEBUG oslo_concurrency.lockutils [req-0ad25ada-e142-4262-9a2f-e61b0381903e req-4f55e876-2c59-49ba-b2a2-e37281f46590 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
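The `instance_info_cache` payloads logged above are nested JSON: a list of VIFs, each with a network, subnets, fixed IPs, and per-IP floating IPs (the second refresh shows 192.168.122.206 attached to 10.100.0.11 and `"active": true`). A sketch for walking that structure; field names match the log, and the sample payload is a trimmed, hypothetical excerpt of the logged entry:

```python
import json

def iter_addresses(network_info):
    """Yield (port_id, fixed_address, floating_addresses) per fixed IP."""
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                floats = [f["address"] for f in ip.get("floating_ips", [])]
                yield vif["id"], ip["address"], floats

# Trimmed excerpt of the cache entry above (illustrative, not complete).
sample = json.loads("""
[{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890",
  "network": {"subnets": [
    {"cidr": "10.100.0.0/28",
     "ips": [{"address": "10.100.0.11", "type": "fixed",
              "floating_ips": [{"address": "192.168.122.206"}]}]}]}}]
""")

for port_id, address, floats in iter_addresses(sample):
    print(port_id, address, floats)
# → b75ffdcd-dd04-4543-8be3-49803a84b890 10.100.0.11 ['192.168.122.206']
```

Comparing the two cache dumps this way is a quick check of what changed between refreshes (here: the floating IP association and the port going active).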
Oct  4 01:56:50 np0005470441 podman[236166]: 2025-10-04 05:56:50.311393504 +0000 UTC m=+0.056435624 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  4 01:56:50 np0005470441 podman[236167]: 2025-10-04 05:56:50.320209538 +0000 UTC m=+0.053163210 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 01:56:53 np0005470441 podman[236211]: 2025-10-04 05:56:53.307655409 +0000 UTC m=+0.057678001 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:56:53 np0005470441 podman[236212]: 2025-10-04 05:56:53.318405818 +0000 UTC m=+0.063028534 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0)
Oct  4 01:56:53 np0005470441 nova_compute[192626]: 2025-10-04 05:56:53.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:54 np0005470441 nova_compute[192626]: 2025-10-04 05:56:54.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:58 np0005470441 nova_compute[192626]: 2025-10-04 05:56:58.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:56:59 np0005470441 nova_compute[192626]: 2025-10-04 05:56:59.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:01 np0005470441 podman[236252]: 2025-10-04 05:57:01.302740528 +0000 UTC m=+0.059070940 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 01:57:03 np0005470441 nova_compute[192626]: 2025-10-04 05:57:03.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:04 np0005470441 nova_compute[192626]: 2025-10-04 05:57:04.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:06 np0005470441 podman[236272]: 2025-10-04 05:57:06.298904768 +0000 UTC m=+0.058139153 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:57:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:06.768 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:08 np0005470441 podman[236296]: 2025-10-04 05:57:08.299410399 +0000 UTC m=+0.050852074 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  4 01:57:09 np0005470441 nova_compute[192626]: 2025-10-04 05:57:09.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:09 np0005470441 nova_compute[192626]: 2025-10-04 05:57:09.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:10 np0005470441 nova_compute[192626]: 2025-10-04 05:57:10.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:11 np0005470441 nova_compute[192626]: 2025-10-04 05:57:11.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:12 np0005470441 podman[236315]: 2025-10-04 05:57:12.347695894 +0000 UTC m=+0.098864595 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:57:14 np0005470441 nova_compute[192626]: 2025-10-04 05:57:14.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:14 np0005470441 nova_compute[192626]: 2025-10-04 05:57:14.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:17 np0005470441 nova_compute[192626]: 2025-10-04 05:57:17.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:17 np0005470441 nova_compute[192626]: 2025-10-04 05:57:17.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:57:19 np0005470441 nova_compute[192626]: 2025-10-04 05:57:19.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:19 np0005470441 nova_compute[192626]: 2025-10-04 05:57:19.114 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:19 np0005470441 nova_compute[192626]: 2025-10-04 05:57:19.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:21 np0005470441 podman[236342]: 2025-10-04 05:57:21.333111931 +0000 UTC m=+0.071457977 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:57:21 np0005470441 podman[236341]: 2025-10-04 05:57:21.335122239 +0000 UTC m=+0.075825012 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Oct  4 01:57:23 np0005470441 nova_compute[192626]: 2025-10-04 05:57:23.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:23 np0005470441 nova_compute[192626]: 2025-10-04 05:57:23.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:57:23 np0005470441 nova_compute[192626]: 2025-10-04 05:57:23.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.265 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.265 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.265 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:57:24 np0005470441 nova_compute[192626]: 2025-10-04 05:57:24.265 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 753b29af-9c02-4eb4-a936-6c2a29a1ca6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:57:24 np0005470441 podman[236384]: 2025-10-04 05:57:24.297799418 +0000 UTC m=+0.051104931 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 01:57:24 np0005470441 podman[236385]: 2025-10-04 05:57:24.306192819 +0000 UTC m=+0.055396634 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.605 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [{"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.622 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.622 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.623 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.624 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.652 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.652 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.653 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.653 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.745 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.813 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.814 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:57:26 np0005470441 nova_compute[192626]: 2025-10-04 05:57:26.891 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.076 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.077 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5577MB free_disk=73.38838195800781GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.077 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.077 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.155 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance 753b29af-9c02-4eb4-a936-6c2a29a1ca6b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.155 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.155 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.171 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.185 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.185 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.199 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.220 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.264 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.283 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.307 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:57:27 np0005470441 nova_compute[192626]: 2025-10-04 05:57:27.307 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:29 np0005470441 nova_compute[192626]: 2025-10-04 05:57:29.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:29 np0005470441 nova_compute[192626]: 2025-10-04 05:57:29.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:29 np0005470441 nova_compute[192626]: 2025-10-04 05:57:29.400 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:29 np0005470441 nova_compute[192626]: 2025-10-04 05:57:29.418 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:29 np0005470441 nova_compute[192626]: 2025-10-04 05:57:29.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:57:32 np0005470441 podman[236431]: 2025-10-04 05:57:32.304619336 +0000 UTC m=+0.061970593 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, version=9.6, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  4 01:57:34 np0005470441 nova_compute[192626]: 2025-10-04 05:57:34.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:34 np0005470441 nova_compute[192626]: 2025-10-04 05:57:34.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:37 np0005470441 podman[236452]: 2025-10-04 05:57:37.316212279 +0000 UTC m=+0.066300088 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:57:39 np0005470441 nova_compute[192626]: 2025-10-04 05:57:39.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:39 np0005470441 nova_compute[192626]: 2025-10-04 05:57:39.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:39 np0005470441 podman[236477]: 2025-10-04 05:57:39.309086162 +0000 UTC m=+0.062005974 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:57:40 np0005470441 ovn_controller[94840]: 2025-10-04T05:57:40Z|00394|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Oct  4 01:57:43 np0005470441 podman[236499]: 2025-10-04 05:57:43.355929636 +0000 UTC m=+0.107904734 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Oct  4 01:57:44 np0005470441 nova_compute[192626]: 2025-10-04 05:57:44.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:44 np0005470441 nova_compute[192626]: 2025-10-04 05:57:44.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:44.517 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:57:44 np0005470441 nova_compute[192626]: 2025-10-04 05:57:44.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:44.518 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:57:44 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:44.519 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.022 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.023 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.024 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.024 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.025 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.027 2 INFO nova.compute.manager [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Terminating instance#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.028 2 DEBUG nova.compute.manager [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:57:48 np0005470441 kernel: tapb75ffdcd-dd (unregistering): left promiscuous mode
Oct  4 01:57:48 np0005470441 NetworkManager[51690]: <info>  [1759557468.0498] device (tapb75ffdcd-dd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:57:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:57:48Z|00395|binding|INFO|Releasing lport b75ffdcd-dd04-4543-8be3-49803a84b890 from this chassis (sb_readonly=0)
Oct  4 01:57:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:57:48Z|00396|binding|INFO|Setting lport b75ffdcd-dd04-4543-8be3-49803a84b890 down in Southbound
Oct  4 01:57:48 np0005470441 ovn_controller[94840]: 2025-10-04T05:57:48Z|00397|binding|INFO|Removing iface tapb75ffdcd-dd ovn-installed in OVS
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.075 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:2c:0a 10.100.0.11 2001:db8:0:1:f816:3eff:fe45:2c0a 2001:db8::f816:3eff:fe45:2c0a'], port_security=['fa:16:3e:45:2c:0a 10.100.0.11 2001:db8:0:1:f816:3eff:fe45:2c0a 2001:db8::f816:3eff:fe45:2c0a'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe45:2c0a/64 2001:db8::f816:3eff:fe45:2c0a/64', 'neutron:device_id': '753b29af-9c02-4eb4-a936-6c2a29a1ca6b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be1f8e11-80cc-4d9b-a21f-197709e7fcfe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9de0a55a-1728-4d43-9726-17f22be2dd80, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=b75ffdcd-dd04-4543-8be3-49803a84b890) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.076 103689 INFO neutron.agent.ovn.metadata.agent [-] Port b75ffdcd-dd04-4543-8be3-49803a84b890 in datapath e938f61e-30e4-4d66-b8cd-e55f04207f2a unbound from our chassis#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.077 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e938f61e-30e4-4d66-b8cd-e55f04207f2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.080 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e0d179-2d21-4561-8a3c-5dfd5e868b3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.080 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a namespace which is not needed anymore#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000034.scope: Deactivated successfully.
Oct  4 01:57:48 np0005470441 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000034.scope: Consumed 14.938s CPU time.
Oct  4 01:57:48 np0005470441 systemd-machined[152624]: Machine qemu-29-instance-00000034 terminated.
Oct  4 01:57:48 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [NOTICE]   (236085) : haproxy version is 2.8.14-c23fe91
Oct  4 01:57:48 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [NOTICE]   (236085) : path to executable is /usr/sbin/haproxy
Oct  4 01:57:48 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [WARNING]  (236085) : Exiting Master process...
Oct  4 01:57:48 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [ALERT]    (236085) : Current worker (236087) exited with code 143 (Terminated)
Oct  4 01:57:48 np0005470441 neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a[236072]: [WARNING]  (236085) : All workers exited. Exiting... (0)
Oct  4 01:57:48 np0005470441 systemd[1]: libpod-2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666.scope: Deactivated successfully.
Oct  4 01:57:48 np0005470441 podman[236549]: 2025-10-04 05:57:48.252236904 +0000 UTC m=+0.059136802 container died 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 01:57:48 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666-userdata-shm.mount: Deactivated successfully.
Oct  4 01:57:48 np0005470441 systemd[1]: var-lib-containers-storage-overlay-9f7e142c0bd4b19cda7ecd83b45644106629b8d11b597cfb565097d20ab56060-merged.mount: Deactivated successfully.
Oct  4 01:57:48 np0005470441 podman[236549]: 2025-10-04 05:57:48.297708622 +0000 UTC m=+0.104608520 container cleanup 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.297 2 INFO nova.virt.libvirt.driver [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance destroyed successfully.#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.298 2 DEBUG nova.objects.instance [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid 753b29af-9c02-4eb4-a936-6c2a29a1ca6b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:57:48 np0005470441 systemd[1]: libpod-conmon-2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666.scope: Deactivated successfully.
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.323 2 DEBUG nova.virt.libvirt.vif [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:56:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-588804165',display_name='tempest-TestGettingAddress-server-588804165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-588804165',id=52,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPdhVl89e0BeyqDsYM0oDOcVwrB3PmrERcpLbjIle7yvpHDZc+YrNQpXy4dmNQkUqV1/F3L1PX0UF/fii5DLVhqbB4jPtiuHHjnIcR8APf+STrmV2Yc5NdQU2y1yow77WQ==',key_name='tempest-TestGettingAddress-212026097',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:56:35Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-i6b5qbx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:56:35Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=753b29af-9c02-4eb4-a936-6c2a29a1ca6b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.324 2 DEBUG nova.network.os_vif_util [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "b75ffdcd-dd04-4543-8be3-49803a84b890", "address": "fa:16:3e:45:2c:0a", "network": {"id": "e938f61e-30e4-4d66-b8cd-e55f04207f2a", "bridge": "br-int", "label": "tempest-network-smoke--1123518369", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe45:2c0a", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb75ffdcd-dd", "ovs_interfaceid": "b75ffdcd-dd04-4543-8be3-49803a84b890", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.325 2 DEBUG nova.network.os_vif_util [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.325 2 DEBUG os_vif [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.327 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb75ffdcd-dd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.333 2 INFO os_vif [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:45:2c:0a,bridge_name='br-int',has_traffic_filtering=True,id=b75ffdcd-dd04-4543-8be3-49803a84b890,network=Network(e938f61e-30e4-4d66-b8cd-e55f04207f2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb75ffdcd-dd')#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.333 2 INFO nova.virt.libvirt.driver [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Deleting instance files /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b_del#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.334 2 INFO nova.virt.libvirt.driver [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Deletion of /var/lib/nova/instances/753b29af-9c02-4eb4-a936-6c2a29a1ca6b_del complete#033[00m
Oct  4 01:57:48 np0005470441 podman[236595]: 2025-10-04 05:57:48.363057912 +0000 UTC m=+0.047309032 container remove 2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.367 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e59b32dc-121e-445d-a791-37d69ed6a844]: (4, ('Sat Oct  4 05:57:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a (2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666)\n2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666\nSat Oct  4 05:57:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a (2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666)\n2fc103e878440b9c70c0a658b855e27953a2f3ce97e9aacdff46a340e0c79666\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.370 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b63db7ec-50df-4820-84d2-35a3b97aaee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.371 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape938f61e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:57:48 np0005470441 kernel: tape938f61e-30: left promiscuous mode
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.380 2 INFO nova.compute.manager [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.380 2 DEBUG oslo.service.loopingcall [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.381 2 DEBUG nova.compute.manager [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.381 2 DEBUG nova.network.neutron [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.385 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[518251ef-05f1-4ab4-a723-bb1593dcf767]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.406 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[f1256f84-3c32-4919-934a-e770975fc07d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.407 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4b3132-0c35-4e5b-8cdf-05a8674ba5f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.420 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[817e810f-38ad-4792-8804-88ddfaa7d426]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 529247, 'reachable_time': 27409, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236609, 'error': None, 'target': 'ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 systemd[1]: run-netns-ovnmeta\x2de938f61e\x2d30e4\x2d4d66\x2db8cd\x2de55f04207f2a.mount: Deactivated successfully.
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.425 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e938f61e-30e4-4d66-b8cd-e55f04207f2a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:57:48 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:57:48.426 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[f440d10c-d9bb-49cf-b3d3-7cc5537f507e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.949 2 DEBUG nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-unplugged-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.949 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.950 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.950 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.950 2 DEBUG nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] No waiting events found dispatching network-vif-unplugged-b75ffdcd-dd04-4543-8be3-49803a84b890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.950 2 DEBUG nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-unplugged-b75ffdcd-dd04-4543-8be3-49803a84b890 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.950 2 DEBUG nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.951 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.951 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.951 2 DEBUG oslo_concurrency.lockutils [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.952 2 DEBUG nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] No waiting events found dispatching network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:57:48 np0005470441 nova_compute[192626]: 2025-10-04 05:57:48.952 2 WARNING nova.compute.manager [req-f505ae71-3312-49f4-8ce9-92c450c676e4 req-997a70bc-eecc-4de0-bcfb-5c099fc52f60 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received unexpected event network-vif-plugged-b75ffdcd-dd04-4543-8be3-49803a84b890 for instance with vm_state active and task_state deleting.#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.533 2 DEBUG nova.network.neutron [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.580 2 INFO nova.compute.manager [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Took 1.20 seconds to deallocate network for instance.#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.601 2 DEBUG nova.compute.manager [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.602 2 DEBUG nova.compute.manager [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing instance network info cache due to event network-changed-b75ffdcd-dd04-4543-8be3-49803a84b890. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.602 2 DEBUG oslo_concurrency.lockutils [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.602 2 DEBUG oslo_concurrency.lockutils [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.603 2 DEBUG nova.network.neutron [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Refreshing network info cache for port b75ffdcd-dd04-4543-8be3-49803a84b890 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.633 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.634 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.692 2 DEBUG nova.compute.provider_tree [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.710 2 DEBUG nova.scheduler.client.report [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.735 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.765 2 DEBUG nova.network.neutron [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.770 2 INFO nova.scheduler.client.report [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance 753b29af-9c02-4eb4-a936-6c2a29a1ca6b#033[00m
Oct  4 01:57:49 np0005470441 nova_compute[192626]: 2025-10-04 05:57:49.855 2 DEBUG oslo_concurrency.lockutils [None req-cb5ce0b9-5f2f-47ac-a646-25935eae5b5c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "753b29af-9c02-4eb4-a936-6c2a29a1ca6b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:57:50 np0005470441 nova_compute[192626]: 2025-10-04 05:57:50.106 2 DEBUG nova.network.neutron [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:57:50 np0005470441 nova_compute[192626]: 2025-10-04 05:57:50.126 2 DEBUG oslo_concurrency.lockutils [req-0482d1d2-8871-405e-92e5-4624e41ae221 req-04662fb2-4e2c-4f3c-bc34-1ec9baa8f46f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-753b29af-9c02-4eb4-a936-6c2a29a1ca6b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:57:51 np0005470441 nova_compute[192626]: 2025-10-04 05:57:51.692 2 DEBUG nova.compute.manager [req-f65e55dc-0412-43f8-855e-9447699bf447 req-94a1ccfd-a72c-42b8-b5e7-aaa450a488f5 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Received event network-vif-deleted-b75ffdcd-dd04-4543-8be3-49803a84b890 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:57:52 np0005470441 podman[236610]: 2025-10-04 05:57:52.312215795 +0000 UTC m=+0.065381372 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct  4 01:57:52 np0005470441 podman[236611]: 2025-10-04 05:57:52.31240105 +0000 UTC m=+0.063315832 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:57:53 np0005470441 nova_compute[192626]: 2025-10-04 05:57:53.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:54 np0005470441 nova_compute[192626]: 2025-10-04 05:57:54.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:55 np0005470441 podman[236655]: 2025-10-04 05:57:55.319368181 +0000 UTC m=+0.061229022 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  4 01:57:55 np0005470441 podman[236654]: 2025-10-04 05:57:55.319727011 +0000 UTC m=+0.065154485 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  4 01:57:57 np0005470441 nova_compute[192626]: 2025-10-04 05:57:57.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:57 np0005470441 nova_compute[192626]: 2025-10-04 05:57:57.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:58 np0005470441 nova_compute[192626]: 2025-10-04 05:57:58.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:57:59 np0005470441 nova_compute[192626]: 2025-10-04 05:57:59.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 05:58:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 01:58:03 np0005470441 nova_compute[192626]: 2025-10-04 05:58:03.297 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557468.2958539, 753b29af-9c02-4eb4-a936-6c2a29a1ca6b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:58:03 np0005470441 nova_compute[192626]: 2025-10-04 05:58:03.297 2 INFO nova.compute.manager [-] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] VM Stopped (Lifecycle Event)#033[00m
Oct  4 01:58:03 np0005470441 podman[236697]: 2025-10-04 05:58:03.328021212 +0000 UTC m=+0.073060163 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm)
Oct  4 01:58:03 np0005470441 nova_compute[192626]: 2025-10-04 05:58:03.334 2 DEBUG nova.compute.manager [None req-e81f701d-2b91-4c32-a788-f4fb9fb8d678 - - - - - -] [instance: 753b29af-9c02-4eb4-a936-6c2a29a1ca6b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:58:03 np0005470441 nova_compute[192626]: 2025-10-04 05:58:03.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:04 np0005470441 nova_compute[192626]: 2025-10-04 05:58:04.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:58:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:06.767 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:58:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:06.768 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:58:08 np0005470441 podman[236719]: 2025-10-04 05:58:08.315502433 +0000 UTC m=+0.058453893 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:58:08 np0005470441 nova_compute[192626]: 2025-10-04 05:58:08.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:09 np0005470441 nova_compute[192626]: 2025-10-04 05:58:09.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:10 np0005470441 podman[236743]: 2025-10-04 05:58:10.29535238 +0000 UTC m=+0.049374321 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  4 01:58:10 np0005470441 nova_compute[192626]: 2025-10-04 05:58:10.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:12 np0005470441 nova_compute[192626]: 2025-10-04 05:58:12.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:13 np0005470441 nova_compute[192626]: 2025-10-04 05:58:13.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:14 np0005470441 nova_compute[192626]: 2025-10-04 05:58:14.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:14 np0005470441 podman[236762]: 2025-10-04 05:58:14.336549891 +0000 UTC m=+0.092693708 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  4 01:58:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:17.631 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:91:5c 10.100.0.2 2001:db8::f816:3eff:fe8e:915c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8e:915c/64', 'neutron:device_id': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4818976-1ea3-44a2-8e1e-01d8bdb69587, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=39787b98-a5ea-4039-837e-576aea13585e) old=Port_Binding(mac=['fa:16:3e:8e:91:5c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:58:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:17.633 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 39787b98-a5ea-4039-837e-576aea13585e in datapath 9a51c878-1020-4657-81a6-55dce1561465 updated#033[00m
Oct  4 01:58:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:17.633 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a51c878-1020-4657-81a6-55dce1561465, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:58:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:17.635 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa219f1-2bb5-4afc-a716-53f7e7ffe373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:58:18 np0005470441 nova_compute[192626]: 2025-10-04 05:58:18.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:18 np0005470441 nova_compute[192626]: 2025-10-04 05:58:18.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:18 np0005470441 nova_compute[192626]: 2025-10-04 05:58:18.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:58:19 np0005470441 nova_compute[192626]: 2025-10-04 05:58:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:19 np0005470441 nova_compute[192626]: 2025-10-04 05:58:19.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:23 np0005470441 podman[236789]: 2025-10-04 05:58:23.300378845 +0000 UTC m=+0.053920282 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:58:23 np0005470441 podman[236788]: 2025-10-04 05:58:23.307300114 +0000 UTC m=+0.062995523 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  4 01:58:23 np0005470441 nova_compute[192626]: 2025-10-04 05:58:23.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.731 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 01:58:24 np0005470441 nova_compute[192626]: 2025-10-04 05:58:24.731 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.743 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.744 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.744 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:58:25 np0005470441 podman[236833]: 2025-10-04 05:58:25.856624943 +0000 UTC m=+0.070557891 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Oct  4 01:58:25 np0005470441 podman[236834]: 2025-10-04 05:58:25.880252462 +0000 UTC m=+0.083626266 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, 
tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.928 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.929 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5774MB free_disk=73.41755294799805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.929 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.929 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.978 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.978 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:58:25 np0005470441 nova_compute[192626]: 2025-10-04 05:58:25.998 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:58:26 np0005470441 nova_compute[192626]: 2025-10-04 05:58:26.010 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:58:26 np0005470441 nova_compute[192626]: 2025-10-04 05:58:26.029 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:58:26 np0005470441 nova_compute[192626]: 2025-10-04 05:58:26.029 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:58:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:26.660 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:91:5c 10.100.0.2 2001:db8:0:1:f816:3eff:fe8e:915c 2001:db8::f816:3eff:fe8e:915c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe8e:915c/64 2001:db8::f816:3eff:fe8e:915c/64', 'neutron:device_id': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4818976-1ea3-44a2-8e1e-01d8bdb69587, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=39787b98-a5ea-4039-837e-576aea13585e) old=Port_Binding(mac=['fa:16:3e:8e:91:5c 10.100.0.2 2001:db8::f816:3eff:fe8e:915c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8e:915c/64', 'neutron:device_id': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:58:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:26.661 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 39787b98-a5ea-4039-837e-576aea13585e in datapath 9a51c878-1020-4657-81a6-55dce1561465 updated#033[00m
Oct  4 01:58:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:26.662 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a51c878-1020-4657-81a6-55dce1561465, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:58:26 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:58:26.663 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[76800623-ca8b-46a4-9b3d-eef635661d40]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:58:28 np0005470441 nova_compute[192626]: 2025-10-04 05:58:28.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:29 np0005470441 nova_compute[192626]: 2025-10-04 05:58:29.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:30 np0005470441 nova_compute[192626]: 2025-10-04 05:58:30.030 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:30 np0005470441 nova_compute[192626]: 2025-10-04 05:58:30.030 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:58:33 np0005470441 nova_compute[192626]: 2025-10-04 05:58:33.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:34 np0005470441 nova_compute[192626]: 2025-10-04 05:58:34.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:34 np0005470441 podman[236871]: 2025-10-04 05:58:34.343321372 +0000 UTC m=+0.094433707 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  4 01:58:38 np0005470441 nova_compute[192626]: 2025-10-04 05:58:38.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:39 np0005470441 nova_compute[192626]: 2025-10-04 05:58:39.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:39 np0005470441 podman[236893]: 2025-10-04 05:58:39.308476251 +0000 UTC m=+0.053518261 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 01:58:41 np0005470441 podman[236918]: 2025-10-04 05:58:41.290501011 +0000 UTC m=+0.047885937 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  4 01:58:41 np0005470441 ovn_controller[94840]: 2025-10-04T05:58:41Z|00398|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  4 01:58:43 np0005470441 nova_compute[192626]: 2025-10-04 05:58:43.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:44 np0005470441 nova_compute[192626]: 2025-10-04 05:58:44.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:45 np0005470441 podman[236938]: 2025-10-04 05:58:45.322012904 +0000 UTC m=+0.075772841 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  4 01:58:48 np0005470441 nova_compute[192626]: 2025-10-04 05:58:48.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:49 np0005470441 nova_compute[192626]: 2025-10-04 05:58:49.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:53 np0005470441 nova_compute[192626]: 2025-10-04 05:58:53.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:54 np0005470441 nova_compute[192626]: 2025-10-04 05:58:54.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:54 np0005470441 podman[236964]: 2025-10-04 05:58:54.31717564 +0000 UTC m=+0.062641693 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 01:58:54 np0005470441 podman[236963]: 2025-10-04 05:58:54.337210246 +0000 UTC m=+0.080176877 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  4 01:58:56 np0005470441 podman[237008]: 2025-10-04 05:58:56.327905766 +0000 UTC m=+0.069191301 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:58:56 np0005470441 podman[237007]: 2025-10-04 05:58:56.334728212 +0000 UTC m=+0.074562456 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:58:58 np0005470441 nova_compute[192626]: 2025-10-04 05:58:58.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:58:59 np0005470441 nova_compute[192626]: 2025-10-04 05:58:59.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:03 np0005470441 nova_compute[192626]: 2025-10-04 05:59:03.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:04 np0005470441 nova_compute[192626]: 2025-10-04 05:59:04.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:05 np0005470441 podman[237048]: 2025-10-04 05:59:05.312371394 +0000 UTC m=+0.057751692 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, config_id=edpm, io.buildah.version=1.33.7)
Oct  4 01:59:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:06.768 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:06.768 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:06.768 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:07 np0005470441 nova_compute[192626]: 2025-10-04 05:59:07.919 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:07 np0005470441 nova_compute[192626]: 2025-10-04 05:59:07.920 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:07 np0005470441 nova_compute[192626]: 2025-10-04 05:59:07.944 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.028 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.029 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.035 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.036 2 INFO nova.compute.claims [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.158 2 DEBUG nova.compute.provider_tree [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.174 2 DEBUG nova.scheduler.client.report [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.202 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.203 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.253 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.253 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.276 2 INFO nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.302 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.392 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.393 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.394 2 INFO nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Creating image(s)#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.395 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.395 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.396 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.413 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.505 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.507 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.508 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.536 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.558 2 DEBUG nova.policy [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.605 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.606 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.642 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.644 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.645 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.726 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.727 2 DEBUG nova.virt.disk.api [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.728 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.791 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.792 2 DEBUG nova.virt.disk.api [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.793 2 DEBUG nova.objects.instance [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid c0d0e826-f5c3-4ab1-93d7-005e90bc4795 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.809 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.810 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Ensure instance console log exists: /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.810 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.811 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:08 np0005470441 nova_compute[192626]: 2025-10-04 05:59:08.811 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:09 np0005470441 nova_compute[192626]: 2025-10-04 05:59:09.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:09 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:09.546 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:59:09 np0005470441 nova_compute[192626]: 2025-10-04 05:59:09.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:09 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:09.548 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 01:59:09 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:09.549 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:09 np0005470441 nova_compute[192626]: 2025-10-04 05:59:09.713 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Successfully created port: e9142885-1e67-4901-a5c7-7e4fe46e60bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 01:59:10 np0005470441 podman[237085]: 2025-10-04 05:59:10.30078435 +0000 UTC m=+0.057401552 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 01:59:11 np0005470441 nova_compute[192626]: 2025-10-04 05:59:11.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:11 np0005470441 nova_compute[192626]: 2025-10-04 05:59:11.968 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Successfully updated port: e9142885-1e67-4901-a5c7-7e4fe46e60bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.081 2 DEBUG nova.compute.manager [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.081 2 DEBUG nova.compute.manager [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing instance network info cache due to event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.082 2 DEBUG oslo_concurrency.lockutils [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.082 2 DEBUG oslo_concurrency.lockutils [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.083 2 DEBUG nova.network.neutron [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing network info cache for port e9142885-1e67-4901-a5c7-7e4fe46e60bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.087 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.291 2 DEBUG nova.network.neutron [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:59:12 np0005470441 podman[237110]: 2025-10-04 05:59:12.297125533 +0000 UTC m=+0.054726395 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.545 2 DEBUG nova.network.neutron [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.572 2 DEBUG oslo_concurrency.lockutils [req-541202d8-2b44-4edb-a16a-b6e70f28d5e7 req-e3dc183c-04ab-498c-829c-719292df1a4f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.573 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.573 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 01:59:12 np0005470441 nova_compute[192626]: 2025-10-04 05:59:12.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:13 np0005470441 nova_compute[192626]: 2025-10-04 05:59:13.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:13 np0005470441 nova_compute[192626]: 2025-10-04 05:59:13.508 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 01:59:14 np0005470441 nova_compute[192626]: 2025-10-04 05:59:14.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.025 2 DEBUG nova.network.neutron [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.171 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.171 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance network_info: |[{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.174 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Start _get_guest_xml network_info=[{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.180 2 WARNING nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.196 2 DEBUG nova.virt.libvirt.host [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.197 2 DEBUG nova.virt.libvirt.host [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.201 2 DEBUG nova.virt.libvirt.host [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.202 2 DEBUG nova.virt.libvirt.host [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.204 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.204 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.205 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.205 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.205 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.206 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.206 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.206 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.207 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.207 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.207 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.208 2 DEBUG nova.virt.hardware [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.213 2 DEBUG nova.virt.libvirt.vif [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:59:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2024916855',display_name='tempest-TestGettingAddress-server-2024916855',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2024916855',id=55,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPwwPzJFadC2kwfz/S/TB9vuzrRhe2/Ej9szXyfe7fb5VaSXehvkg5LpA20Ygyd/hSbHAW3IL/pA1IeguZ262tZ7EeN0xj15osBk5FplUhU+uFhiERPpfxZA3K9B3mylLQ==',key_name='tempest-TestGettingAddress-44626960',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0mfrd0im',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:59:08Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=c0d0e826-f5c3-4ab1-93d7-005e90bc4795,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.213 2 DEBUG nova.network.os_vif_util [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.214 2 DEBUG nova.network.os_vif_util [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.215 2 DEBUG nova.objects.instance [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid c0d0e826-f5c3-4ab1-93d7-005e90bc4795 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.318 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] End _get_guest_xml xml=<domain type="kvm">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <uuid>c0d0e826-f5c3-4ab1-93d7-005e90bc4795</uuid>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <name>instance-00000037</name>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-2024916855</nova:name>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 05:59:16</nova:creationTime>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        <nova:port uuid="e9142885-1e67-4901-a5c7-7e4fe46e60bf">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe27:3deb" ipVersion="6"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe27:3deb" ipVersion="6"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <system>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="serial">c0d0e826-f5c3-4ab1-93d7-005e90bc4795</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="uuid">c0d0e826-f5c3-4ab1-93d7-005e90bc4795</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </system>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <os>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </os>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <features>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </features>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </clock>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  <devices>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.config"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </disk>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:27:3d:eb"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <target dev="tape9142885-1e"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </interface>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/console.log" append="off"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </serial>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <video>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </video>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </rng>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 01:59:16 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 01:59:16 np0005470441 nova_compute[192626]:  </devices>
Oct  4 01:59:16 np0005470441 nova_compute[192626]: </domain>
Oct  4 01:59:16 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.319 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Preparing to wait for external event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.320 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.320 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.321 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.322 2 DEBUG nova.virt.libvirt.vif [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T05:59:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2024916855',display_name='tempest-TestGettingAddress-server-2024916855',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2024916855',id=55,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPwwPzJFadC2kwfz/S/TB9vuzrRhe2/Ej9szXyfe7fb5VaSXehvkg5LpA20Ygyd/hSbHAW3IL/pA1IeguZ262tZ7EeN0xj15osBk5FplUhU+uFhiERPpfxZA3K9B3mylLQ==',key_name='tempest-TestGettingAddress-44626960',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0mfrd0im',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T05:59:08Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=c0d0e826-f5c3-4ab1-93d7-005e90bc4795,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.323 2 DEBUG nova.network.os_vif_util [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.325 2 DEBUG nova.network.os_vif_util [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.326 2 DEBUG os_vif [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.327 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.332 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9142885-1e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9142885-1e, col_values=(('external_ids', {'iface-id': 'e9142885-1e67-4901-a5c7-7e4fe46e60bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:3d:eb', 'vm-uuid': 'c0d0e826-f5c3-4ab1-93d7-005e90bc4795'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:16 np0005470441 NetworkManager[51690]: <info>  [1759557556.3373] manager: (tape9142885-1e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/196)
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.343 2 INFO os_vif [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e')
Oct  4 01:59:16 np0005470441 podman[237129]: 2025-10-04 05:59:16.377298805 +0000 UTC m=+0.132990556 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.538 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.539 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.539 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:27:3d:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Oct  4 01:59:16 np0005470441 nova_compute[192626]: 2025-10-04 05:59:16.540 2 INFO nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Using config drive
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.670 2 INFO nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Creating config drive at /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.config
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.679 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mi5cow4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.812 2 DEBUG oslo_concurrency.processutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mi5cow4" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  4 01:59:17 np0005470441 kernel: tape9142885-1e: entered promiscuous mode
Oct  4 01:59:17 np0005470441 NetworkManager[51690]: <info>  [1759557557.8773] manager: (tape9142885-1e): new Tun device (/org/freedesktop/NetworkManager/Devices/197)
Oct  4 01:59:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:17Z|00399|binding|INFO|Claiming lport e9142885-1e67-4901-a5c7-7e4fe46e60bf for this chassis.
Oct  4 01:59:17 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:17Z|00400|binding|INFO|e9142885-1e67-4901-a5c7-7e4fe46e60bf: Claiming fa:16:3e:27:3d:eb 10.100.0.3 2001:db8:0:1:f816:3eff:fe27:3deb 2001:db8::f816:3eff:fe27:3deb
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.890 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:17 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 01:59:17 np0005470441 NetworkManager[51690]: <info>  [1759557557.8980] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Oct  4 01:59:17 np0005470441 NetworkManager[51690]: <info>  [1759557557.8994] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.905 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:3d:eb 10.100.0.3 2001:db8:0:1:f816:3eff:fe27:3deb 2001:db8::f816:3eff:fe27:3deb'], port_security=['fa:16:3e:27:3d:eb 10.100.0.3 2001:db8:0:1:f816:3eff:fe27:3deb 2001:db8::f816:3eff:fe27:3deb'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8:0:1:f816:3eff:fe27:3deb/64 2001:db8::f816:3eff:fe27:3deb/64', 'neutron:device_id': 'c0d0e826-f5c3-4ab1-93d7-005e90bc4795', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c0b1059d-77a3-4b65-956a-9ac3f36558c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4818976-1ea3-44a2-8e1e-01d8bdb69587, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e9142885-1e67-4901-a5c7-7e4fe46e60bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.906 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e9142885-1e67-4901-a5c7-7e4fe46e60bf in datapath 9a51c878-1020-4657-81a6-55dce1561465 bound to our chassis
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.907 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9a51c878-1020-4657-81a6-55dce1561465
Oct  4 01:59:17 np0005470441 systemd-udevd[237174]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.925 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[01589c53-c649-40cf-bf25-37e18f1eb356]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.926 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9a51c878-11 in ovnmeta-9a51c878-1020-4657-81a6-55dce1561465 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  4 01:59:17 np0005470441 systemd-machined[152624]: New machine qemu-30-instance-00000037.
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.927 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9a51c878-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.927 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[6cc18a90-e155-4d12-aef8-fe0c49fe37e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:17 np0005470441 NetworkManager[51690]: <info>  [1759557557.9306] device (tape9142885-1e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.930 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[a241082c-7f14-4963-ac25-0a906e8e135a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:17 np0005470441 NetworkManager[51690]: <info>  [1759557557.9319] device (tape9142885-1e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.943 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[95d4a89c-fa50-4b6c-9646-2780f2708b16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:17 np0005470441 systemd[1]: Started Virtual Machine qemu-30-instance-00000037.
Oct  4 01:59:17 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:17.989 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[bb60ae90-fcf3-4869-ab0f-3f61be369683]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:17.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:18Z|00401|binding|INFO|Setting lport e9142885-1e67-4901-a5c7-7e4fe46e60bf ovn-installed in OVS
Oct  4 01:59:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:18Z|00402|binding|INFO|Setting lport e9142885-1e67-4901-a5c7-7e4fe46e60bf up in Southbound
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.032 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[27091974-483a-4496-a73e-1fba254f1c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.038 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[de5ee8e9-b607-4227-a039-f53eed3d4d0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 NetworkManager[51690]: <info>  [1759557558.0390] manager: (tap9a51c878-10): new Veth device (/org/freedesktop/NetworkManager/Devices/200)
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.072 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[67d0804d-4a53-457d-aa54-0e7da91383c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.076 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[a3478e16-63b0-4ed9-b2a8-7dadce51e1a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 NetworkManager[51690]: <info>  [1759557558.1000] device (tap9a51c878-10): carrier: link connected
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.107 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[1662015a-4d49-4cf7-b321-88fb288e2e26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.128 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9b9da214-97d3-427f-9c83-fa6c4d31c488]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a51c878-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:91:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545575, 'reachable_time': 31042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237208, 'error': None, 'target': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.143 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[529ba703-87af-4bef-aca2-446af180bc99]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:915c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 545575, 'tstamp': 545575}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237209, 'error': None, 'target': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.165 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[136d81aa-bdcd-4dad-b5e1-1341428e139c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9a51c878-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:91:5c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545575, 'reachable_time': 31042, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237210, 'error': None, 'target': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.197 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[9dac0f72-4ed1-46eb-9549-9fab95271927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.269 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[58ae4d98-0f62-41ae-9996-6c677ee739d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.271 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a51c878-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.272 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.273 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9a51c878-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 NetworkManager[51690]: <info>  [1759557558.2761] manager: (tap9a51c878-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/201)
Oct  4 01:59:18 np0005470441 kernel: tap9a51c878-10: entered promiscuous mode
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.281 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9a51c878-10, col_values=(('external_ids', {'iface-id': '39787b98-a5ea-4039-837e-576aea13585e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:18Z|00403|binding|INFO|Releasing lport 39787b98-a5ea-4039-837e-576aea13585e from this chassis (sb_readonly=0)
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.309 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9a51c878-1020-4657-81a6-55dce1561465.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9a51c878-1020-4657-81a6-55dce1561465.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.311 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[8954f8a5-cae3-4f74-ab05-f0c70ab16023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.312 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-9a51c878-1020-4657-81a6-55dce1561465
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/9a51c878-1020-4657-81a6-55dce1561465.pid.haproxy
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 9a51c878-1020-4657-81a6-55dce1561465
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 01:59:18 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:18.316 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'env', 'PROCESS_TAG=haproxy-9a51c878-1020-4657-81a6-55dce1561465', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9a51c878-1020-4657-81a6-55dce1561465.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 01:59:18 np0005470441 podman[237249]: 2025-10-04 05:59:18.734673512 +0000 UTC m=+0.063946240 container create 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  4 01:59:18 np0005470441 systemd[1]: Started libpod-conmon-0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5.scope.
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.784 2 DEBUG nova.compute.manager [req-34eb54bd-aaba-4f85-8946-d0df2fa938c6 req-19c2eb57-5877-435f-943a-a7f51195de01 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.785 2 DEBUG oslo_concurrency.lockutils [req-34eb54bd-aaba-4f85-8946-d0df2fa938c6 req-19c2eb57-5877-435f-943a-a7f51195de01 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.785 2 DEBUG oslo_concurrency.lockutils [req-34eb54bd-aaba-4f85-8946-d0df2fa938c6 req-19c2eb57-5877-435f-943a-a7f51195de01 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.786 2 DEBUG oslo_concurrency.lockutils [req-34eb54bd-aaba-4f85-8946-d0df2fa938c6 req-19c2eb57-5877-435f-943a-a7f51195de01 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:18 np0005470441 nova_compute[192626]: 2025-10-04 05:59:18.787 2 DEBUG nova.compute.manager [req-34eb54bd-aaba-4f85-8946-d0df2fa938c6 req-19c2eb57-5877-435f-943a-a7f51195de01 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Processing event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 01:59:18 np0005470441 podman[237249]: 2025-10-04 05:59:18.696128423 +0000 UTC m=+0.025401141 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 01:59:18 np0005470441 systemd[1]: Started libcrun container.
Oct  4 01:59:18 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2bfb802e390c9ce6c5e1899cab2216fb2e059d58bbe4765cd7fc7bdc798208c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 01:59:18 np0005470441 podman[237249]: 2025-10-04 05:59:18.8204903 +0000 UTC m=+0.149763038 container init 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:59:18 np0005470441 podman[237249]: 2025-10-04 05:59:18.828413728 +0000 UTC m=+0.157686436 container start 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 01:59:18 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [NOTICE]   (237269) : New worker (237271) forked
Oct  4 01:59:18 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [NOTICE]   (237269) : Loading success.
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.006 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557559.0057433, c0d0e826-f5c3-4ab1-93d7-005e90bc4795 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.007 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] VM Started (Lifecycle Event)#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.010 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.014 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.018 2 INFO nova.virt.libvirt.driver [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance spawned successfully.#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.019 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.139 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.149 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.156 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.156 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.157 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.158 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.158 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.159 2 DEBUG nova.virt.libvirt.driver [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.488 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.488 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557559.0059125, c0d0e826-f5c3-4ab1-93d7-005e90bc4795 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.489 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] VM Paused (Lifecycle Event)#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.520 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.523 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557559.0131824, c0d0e826-f5c3-4ab1-93d7-005e90bc4795 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.524 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] VM Resumed (Lifecycle Event)#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.531 2 INFO nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Took 11.14 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.532 2 DEBUG nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.564 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.567 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.599 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.619 2 INFO nova.compute.manager [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Took 11.62 seconds to build instance.#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.643 2 DEBUG oslo_concurrency.lockutils [None req-bbaa5082-d5bc-437c-9f64-c26e6057272c 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:19 np0005470441 nova_compute[192626]: 2025-10-04 05:59:19.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.877 2 DEBUG nova.compute.manager [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.878 2 DEBUG oslo_concurrency.lockutils [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.878 2 DEBUG oslo_concurrency.lockutils [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.879 2 DEBUG oslo_concurrency.lockutils [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.879 2 DEBUG nova.compute.manager [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] No waiting events found dispatching network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:59:20 np0005470441 nova_compute[192626]: 2025-10-04 05:59:20.879 2 WARNING nova.compute.manager [req-62e26a93-6ad6-4e64-a089-28a761d3f465 req-12a5da87-c6d8-47dd-b01b-2e25b7cec0c7 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received unexpected event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf for instance with vm_state active and task_state None.#033[00m
Oct  4 01:59:21 np0005470441 nova_compute[192626]: 2025-10-04 05:59:21.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:24 np0005470441 nova_compute[192626]: 2025-10-04 05:59:24.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:25 np0005470441 podman[237281]: 2025-10-04 05:59:25.329883576 +0000 UTC m=+0.073766413 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:59:25 np0005470441 podman[237280]: 2025-10-04 05:59:25.339471192 +0000 UTC m=+0.084306016 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd)
Oct  4 01:59:25 np0005470441 nova_compute[192626]: 2025-10-04 05:59:25.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:25 np0005470441 nova_compute[192626]: 2025-10-04 05:59:25.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 01:59:25 np0005470441 nova_compute[192626]: 2025-10-04 05:59:25.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.514 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.514 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.515 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.515 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c0d0e826-f5c3-4ab1-93d7-005e90bc4795 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.708 2 DEBUG nova.compute.manager [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.708 2 DEBUG nova.compute.manager [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing instance network info cache due to event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:59:26 np0005470441 nova_compute[192626]: 2025-10-04 05:59:26.709 2 DEBUG oslo_concurrency.lockutils [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:59:27 np0005470441 podman[237324]: 2025-10-04 05:59:27.311549207 +0000 UTC m=+0.066143484 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 01:59:27 np0005470441 podman[237323]: 2025-10-04 05:59:27.318210138 +0000 UTC m=+0.072256149 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.716 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
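The instance_info_cache payload logged above is JSON. A small helper (hypothetical, trimmed to just the fields it touches) that extracts the fixed and floating addresses reported for port e9142885:

```python
import json

# Abbreviated copy of the network_info record from the log above, keeping
# only the fields the helper reads (full record has routes, meta, etc.).
vif_json = '''
[{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf",
  "network": {"subnets": [
    {"cidr": "10.100.0.0/28",
     "ips": [{"address": "10.100.0.3", "type": "fixed",
              "floating_ips": [{"address": "192.168.122.175", "type": "floating"}]}]},
    {"cidr": "2001:db8:0:1::/64",
     "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed",
              "floating_ips": []}]}]}}]
'''

def addresses(network_info):
    """Collect (fixed, floating) address lists across all VIFs and subnets."""
    fixed, floating = [], []
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                fixed.append(ip["address"])
                floating.extend(f["address"] for f in ip.get("floating_ips", []))
    return fixed, floating

fixed, floating = addresses(json.loads(vif_json))
```

This is the same structure the `_heal_instance_info_cache` periodic task refreshes every pass; the floating IP nests under its fixed IP rather than appearing as a separate entry.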
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.740 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.740 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.740 2 DEBUG oslo_concurrency.lockutils [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.741 2 DEBUG nova.network.neutron [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing network info cache for port e9142885-1e67-4901-a5c7-7e4fe46e60bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.742 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.742 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.792 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.793 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.793 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.793 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.872 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.945 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 01:59:28 np0005470441 nova_compute[192626]: 2025-10-04 05:59:28.946 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.001 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
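The resource audit above shells out to `qemu-img info --output=json`, wrapped in `oslo_concurrency.prlimit` with a 1 GiB address-space cap and a 30 s CPU limit. Parsing that JSON is straightforward; a sketch against a canned sample (sizes are hypothetical, since running `qemu-img` itself needs the binary and the image):

```python
import json

# Representative `qemu-img info --output=json` payload. The byte counts
# here are illustrative, not taken from the instance in the log.
sample = '''
{"virtual-size": 1073741824,
 "actual-size": 1576960,
 "filename": "/var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795/disk",
 "format": "qcow2"}
'''

def disk_usage_gib(info_json):
    """Return (virtual, on-disk) sizes in GiB from qemu-img JSON output."""
    info = json.loads(info_json)
    gib = 1024 ** 3
    return info["virtual-size"] / gib, info["actual-size"] / gib

virtual_gib, actual_gib = disk_usage_gib(sample)
```

The `--force-share` flag in the logged command lets the query run while QEMU holds the image open; the prlimit wrapper keeps a runaway `qemu-img` (e.g. on a corrupt image) from exhausting the host.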
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.168 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.169 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5565MB free_disk=73.4167366027832GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.169 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.169 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.247 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance c0d0e826-f5c3-4ab1-93d7-005e90bc4795 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.247 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.248 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.327 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.350 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.372 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 01:59:29 np0005470441 nova_compute[192626]: 2025-10-04 05:59:29.372 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:30 np0005470441 nova_compute[192626]: 2025-10-04 05:59:30.346 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:30 np0005470441 nova_compute[192626]: 2025-10-04 05:59:30.368 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:31 np0005470441 nova_compute[192626]: 2025-10-04 05:59:31.183 2 DEBUG nova.network.neutron [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updated VIF entry in instance network info cache for port e9142885-1e67-4901-a5c7-7e4fe46e60bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:59:31 np0005470441 nova_compute[192626]: 2025-10-04 05:59:31.184 2 DEBUG nova.network.neutron [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:59:31 np0005470441 nova_compute[192626]: 2025-10-04 05:59:31.206 2 DEBUG oslo_concurrency.lockutils [req-750a1535-fc24-43a5-a6d4-66153dd7971d req-d46fac95-78e1-437b-973f-6dde433d7450 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:59:31 np0005470441 nova_compute[192626]: 2025-10-04 05:59:31.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:31 np0005470441 nova_compute[192626]: 2025-10-04 05:59:31.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 01:59:31 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:31Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:27:3d:eb 10.100.0.3
Oct  4 01:59:31 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:31Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:27:3d:eb 10.100.0.3
Oct  4 01:59:34 np0005470441 nova_compute[192626]: 2025-10-04 05:59:34.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:36 np0005470441 podman[237380]: 2025-10-04 05:59:36.306625411 +0000 UTC m=+0.063840668 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git)
Oct  4 01:59:36 np0005470441 nova_compute[192626]: 2025-10-04 05:59:36.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:39 np0005470441 nova_compute[192626]: 2025-10-04 05:59:39.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:41 np0005470441 podman[237402]: 2025-10-04 05:59:41.302656788 +0000 UTC m=+0.056242759 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 01:59:41 np0005470441 nova_compute[192626]: 2025-10-04 05:59:41.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:43 np0005470441 podman[237426]: 2025-10-04 05:59:43.320429806 +0000 UTC m=+0.072615879 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.921 2 DEBUG nova.compute.manager [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.922 2 DEBUG nova.compute.manager [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing instance network info cache due to event network-changed-e9142885-1e67-4901-a5c7-7e4fe46e60bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.922 2 DEBUG oslo_concurrency.lockutils [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.922 2 DEBUG oslo_concurrency.lockutils [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 01:59:44 np0005470441 nova_compute[192626]: 2025-10-04 05:59:44.923 2 DEBUG nova.network.neutron [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Refreshing network info cache for port e9142885-1e67-4901-a5c7-7e4fe46e60bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.137 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.137 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.138 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.138 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.138 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.140 2 INFO nova.compute.manager [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Terminating instance#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.141 2 DEBUG nova.compute.manager [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 01:59:45 np0005470441 kernel: tape9142885-1e (unregistering): left promiscuous mode
Oct  4 01:59:45 np0005470441 NetworkManager[51690]: <info>  [1759557585.1617] device (tape9142885-1e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 01:59:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:45Z|00404|binding|INFO|Releasing lport e9142885-1e67-4901-a5c7-7e4fe46e60bf from this chassis (sb_readonly=0)
Oct  4 01:59:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:45Z|00405|binding|INFO|Setting lport e9142885-1e67-4901-a5c7-7e4fe46e60bf down in Southbound
Oct  4 01:59:45 np0005470441 ovn_controller[94840]: 2025-10-04T05:59:45Z|00406|binding|INFO|Removing iface tape9142885-1e ovn-installed in OVS
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.203 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:3d:eb 10.100.0.3 2001:db8:0:1:f816:3eff:fe27:3deb 2001:db8::f816:3eff:fe27:3deb'], port_security=['fa:16:3e:27:3d:eb 10.100.0.3 2001:db8:0:1:f816:3eff:fe27:3deb 2001:db8::f816:3eff:fe27:3deb'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8:0:1:f816:3eff:fe27:3deb/64 2001:db8::f816:3eff:fe27:3deb/64', 'neutron:device_id': 'c0d0e826-f5c3-4ab1-93d7-005e90bc4795', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9a51c878-1020-4657-81a6-55dce1561465', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c0b1059d-77a3-4b65-956a-9ac3f36558c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4818976-1ea3-44a2-8e1e-01d8bdb69587, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=e9142885-1e67-4901-a5c7-7e4fe46e60bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.204 103689 INFO neutron.agent.ovn.metadata.agent [-] Port e9142885-1e67-4901-a5c7-7e4fe46e60bf in datapath 9a51c878-1020-4657-81a6-55dce1561465 unbound from our chassis#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.205 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9a51c878-1020-4657-81a6-55dce1561465, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.206 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[50a2a972-9766-4dd2-887e-8936282c5dca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.207 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9a51c878-1020-4657-81a6-55dce1561465 namespace which is not needed anymore#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000037.scope: Deactivated successfully.
Oct  4 01:59:45 np0005470441 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d00000037.scope: Consumed 14.054s CPU time.
Oct  4 01:59:45 np0005470441 systemd-machined[152624]: Machine qemu-30-instance-00000037 terminated.
Oct  4 01:59:45 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [NOTICE]   (237269) : haproxy version is 2.8.14-c23fe91
Oct  4 01:59:45 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [NOTICE]   (237269) : path to executable is /usr/sbin/haproxy
Oct  4 01:59:45 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [WARNING]  (237269) : Exiting Master process...
Oct  4 01:59:45 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [ALERT]    (237269) : Current worker (237271) exited with code 143 (Terminated)
Oct  4 01:59:45 np0005470441 neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465[237265]: [WARNING]  (237269) : All workers exited. Exiting... (0)
Oct  4 01:59:45 np0005470441 systemd[1]: libpod-0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5.scope: Deactivated successfully.
Oct  4 01:59:45 np0005470441 podman[237470]: 2025-10-04 05:59:45.37041079 +0000 UTC m=+0.057647559 container died 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.410 2 INFO nova.virt.libvirt.driver [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Instance destroyed successfully.#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.410 2 DEBUG nova.objects.instance [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid c0d0e826-f5c3-4ab1-93d7-005e90bc4795 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 01:59:45 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5-userdata-shm.mount: Deactivated successfully.
Oct  4 01:59:45 np0005470441 systemd[1]: var-lib-containers-storage-overlay-d2bfb802e390c9ce6c5e1899cab2216fb2e059d58bbe4765cd7fc7bdc798208c-merged.mount: Deactivated successfully.
Oct  4 01:59:45 np0005470441 podman[237470]: 2025-10-04 05:59:45.4215257 +0000 UTC m=+0.108762489 container cleanup 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:59:45 np0005470441 systemd[1]: libpod-conmon-0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5.scope: Deactivated successfully.
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.429 2 DEBUG nova.virt.libvirt.vif [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T05:59:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2024916855',display_name='tempest-TestGettingAddress-server-2024916855',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2024916855',id=55,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPwwPzJFadC2kwfz/S/TB9vuzrRhe2/Ej9szXyfe7fb5VaSXehvkg5LpA20Ygyd/hSbHAW3IL/pA1IeguZ262tZ7EeN0xj15osBk5FplUhU+uFhiERPpfxZA3K9B3mylLQ==',key_name='tempest-TestGettingAddress-44626960',keypairs=<?>,launch_index=0,launched_at=2025-10-04T05:59:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-0mfrd0im',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T05:59:19Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=c0d0e826-f5c3-4ab1-93d7-005e90bc4795,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.429 2 DEBUG nova.network.os_vif_util [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.430 2 DEBUG nova.network.os_vif_util [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.430 2 DEBUG os_vif [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.433 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9142885-1e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.438 2 INFO os_vif [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:27:3d:eb,bridge_name='br-int',has_traffic_filtering=True,id=e9142885-1e67-4901-a5c7-7e4fe46e60bf,network=Network(9a51c878-1020-4657-81a6-55dce1561465),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9142885-1e')#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.439 2 INFO nova.virt.libvirt.driver [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Deleting instance files /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795_del#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.440 2 INFO nova.virt.libvirt.driver [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Deletion of /var/lib/nova/instances/c0d0e826-f5c3-4ab1-93d7-005e90bc4795_del complete#033[00m
Oct  4 01:59:45 np0005470441 podman[237517]: 2025-10-04 05:59:45.488276621 +0000 UTC m=+0.040321701 container remove 0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.498 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[61951619-abe0-49e8-a264-fcd6c07008f5]: (4, ('Sat Oct  4 05:59:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465 (0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5)\n0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5\nSat Oct  4 05:59:45 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9a51c878-1020-4657-81a6-55dce1561465 (0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5)\n0441a2113e4977ed3a3ed8819679f2967a0a8cc2283fec9bac3e845a422f1ba5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.500 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[eee411e4-b2dc-4a76-988e-2a7a49409317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.501 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9a51c878-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 kernel: tap9a51c878-10: left promiscuous mode
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.524 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[caa89d9e-af01-4b89-b3fc-d23c6c612ddc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.545 2 INFO nova.compute.manager [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.546 2 DEBUG oslo.service.loopingcall [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.546 2 DEBUG nova.compute.manager [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.546 2 DEBUG nova.network.neutron [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.554 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5e4c0b-ac7a-4da6-a92a-5160a9c99fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.556 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3f83290f-2575-40d1-9c52-420a966f3cfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.578 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d29dd260-e766-459a-a493-1ada9b8c2eac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 545568, 'reachable_time': 21377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237532, 'error': None, 'target': 'ovnmeta-9a51c878-1020-4657-81a6-55dce1561465', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.583 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9a51c878-1020-4657-81a6-55dce1561465 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 01:59:45 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 05:59:45.583 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[77eff133-f418-405f-99f9-60952a13eb96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 01:59:45 np0005470441 systemd[1]: run-netns-ovnmeta\x2d9a51c878\x2d1020\x2d4657\x2d81a6\x2d55dce1561465.mount: Deactivated successfully.
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.823 2 DEBUG nova.compute.manager [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-unplugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.824 2 DEBUG oslo_concurrency.lockutils [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.825 2 DEBUG oslo_concurrency.lockutils [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.826 2 DEBUG oslo_concurrency.lockutils [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.826 2 DEBUG nova.compute.manager [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] No waiting events found dispatching network-vif-unplugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:59:45 np0005470441 nova_compute[192626]: 2025-10-04 05:59:45.827 2 DEBUG nova.compute.manager [req-147b446a-a4e2-49ea-9de2-89de77259e7d req-84c44560-b613-431b-9ae6-13384fd45c45 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-unplugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.088 2 DEBUG nova.network.neutron [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.144 2 INFO nova.compute.manager [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Took 1.60 seconds to deallocate network for instance.#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.208 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.208 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.265 2 DEBUG nova.compute.provider_tree [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.318 2 DEBUG nova.scheduler.client.report [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.345 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:47 np0005470441 podman[237533]: 2025-10-04 05:59:47.384456343 +0000 UTC m=+0.125338767 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.392 2 INFO nova.scheduler.client.report [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance c0d0e826-f5c3-4ab1-93d7-005e90bc4795#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.522 2 DEBUG oslo_concurrency.lockutils [None req-9ed00b95-5088-4d54-b383-c040a0ba5ea9 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.693 2 DEBUG nova.network.neutron [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updated VIF entry in instance network info cache for port e9142885-1e67-4901-a5c7-7e4fe46e60bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.694 2 DEBUG nova.network.neutron [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Updating instance_info_cache with network_info: [{"id": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "address": "fa:16:3e:27:3d:eb", "network": {"id": "9a51c878-1020-4657-81a6-55dce1561465", "bridge": "br-int", "label": "tempest-network-smoke--511273666", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe27:3deb", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9142885-1e", "ovs_interfaceid": "e9142885-1e67-4901-a5c7-7e4fe46e60bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.764 2 DEBUG oslo_concurrency.lockutils [req-11b6e91b-d407-4bed-8c7e-c61dd32c1c57 req-29838289-2316-4901-9d0f-a57cff306395 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-c0d0e826-f5c3-4ab1-93d7-005e90bc4795" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.913 2 DEBUG nova.compute.manager [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.913 2 DEBUG oslo_concurrency.lockutils [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.913 2 DEBUG oslo_concurrency.lockutils [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.914 2 DEBUG oslo_concurrency.lockutils [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "c0d0e826-f5c3-4ab1-93d7-005e90bc4795-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.914 2 DEBUG nova.compute.manager [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] No waiting events found dispatching network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.914 2 WARNING nova.compute.manager [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received unexpected event network-vif-plugged-e9142885-1e67-4901-a5c7-7e4fe46e60bf for instance with vm_state deleted and task_state None.#033[00m
Oct  4 01:59:47 np0005470441 nova_compute[192626]: 2025-10-04 05:59:47.914 2 DEBUG nova.compute.manager [req-3b92a674-d712-4375-bfbf-1c3758d8ab0a req-acb5dc97-cd6b-49f4-8a32-de86eb0468c2 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Received event network-vif-deleted-e9142885-1e67-4901-a5c7-7e4fe46e60bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 01:59:49 np0005470441 nova_compute[192626]: 2025-10-04 05:59:49.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:50 np0005470441 nova_compute[192626]: 2025-10-04 05:59:50.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:54 np0005470441 nova_compute[192626]: 2025-10-04 05:59:54.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:55 np0005470441 nova_compute[192626]: 2025-10-04 05:59:55.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:56 np0005470441 podman[237560]: 2025-10-04 05:59:56.320403125 +0000 UTC m=+0.072459185 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  4 01:59:56 np0005470441 podman[237561]: 2025-10-04 05:59:56.339056882 +0000 UTC m=+0.073876456 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 01:59:58 np0005470441 nova_compute[192626]: 2025-10-04 05:59:58.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:58 np0005470441 nova_compute[192626]: 2025-10-04 05:59:58.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 01:59:58 np0005470441 podman[237609]: 2025-10-04 05:59:58.216405332 +0000 UTC m=+0.052313136 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 01:59:58 np0005470441 podman[237611]: 2025-10-04 05:59:58.241122693 +0000 UTC m=+0.073912187 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  4 01:59:59 np0005470441 nova_compute[192626]: 2025-10-04 05:59:59.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:00 np0005470441 nova_compute[192626]: 2025-10-04 06:00:00.409 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557585.4064803, c0d0e826-f5c3-4ab1-93d7-005e90bc4795 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 02:00:00 np0005470441 nova_compute[192626]: 2025-10-04 06:00:00.409 2 INFO nova.compute.manager [-] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] VM Stopped (Lifecycle Event)#033[00m
Oct  4 02:00:00 np0005470441 nova_compute[192626]: 2025-10-04 06:00:00.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:00 np0005470441 nova_compute[192626]: 2025-10-04 06:00:00.444 2 DEBUG nova.compute.manager [None req-02f7af8b-84c3-4b03-9348-32680f812abf - - - - - -] [instance: c0d0e826-f5c3-4ab1-93d7-005e90bc4795] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.717 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.718 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:00:02.719 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:00:04 np0005470441 nova_compute[192626]: 2025-10-04 06:00:04.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:05 np0005470441 nova_compute[192626]: 2025-10-04 06:00:05.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:06.769 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:00:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:06.770 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:00:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:06.770 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:00:07 np0005470441 podman[237650]: 2025-10-04 06:00:07.315056803 +0000 UTC m=+0.071376214 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, name=ubi9-minimal, managed_by=edpm_ansible, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 02:00:09 np0005470441 nova_compute[192626]: 2025-10-04 06:00:09.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:09 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:09.937 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 02:00:09 np0005470441 nova_compute[192626]: 2025-10-04 06:00:09.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:09 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:09.939 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 02:00:10 np0005470441 nova_compute[192626]: 2025-10-04 06:00:10.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:12 np0005470441 podman[237673]: 2025-10-04 06:00:12.296157509 +0000 UTC m=+0.050967447 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  4 02:00:12 np0005470441 nova_compute[192626]: 2025-10-04 06:00:12.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:13 np0005470441 nova_compute[192626]: 2025-10-04 06:00:13.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:13 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:13.941 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:00:14 np0005470441 nova_compute[192626]: 2025-10-04 06:00:14.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:14 np0005470441 podman[237697]: 2025-10-04 06:00:14.328083575 +0000 UTC m=+0.073843795 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  4 02:00:15 np0005470441 nova_compute[192626]: 2025-10-04 06:00:15.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:15.448 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:41:4c:b9 10.100.0.2 2001:db8::f816:3eff:fe41:4cb9'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe41:4cb9/64', 'neutron:device_id': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08a3064d-ad51-4659-9426-9641fbf843fd, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=191b7d1e-368e-4ed6-9fd7-fb99ed92e607) old=Port_Binding(mac=['fa:16:3e:41:4c:b9 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 02:00:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:15.449 103689 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 191b7d1e-368e-4ed6-9fd7-fb99ed92e607 in datapath 7cf19c94-4304-4371-b395-c2514f30f6bc updated#033[00m
Oct  4 02:00:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:15.450 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7cf19c94-4304-4371-b395-c2514f30f6bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 02:00:15 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:00:15.452 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee00e21-86c2-48b4-85a3-49232bb5bbe2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:00:18 np0005470441 podman[237717]: 2025-10-04 06:00:18.344778031 +0000 UTC m=+0.095768756 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 02:00:19 np0005470441 nova_compute[192626]: 2025-10-04 06:00:19.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:19 np0005470441 nova_compute[192626]: 2025-10-04 06:00:19.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:20 np0005470441 nova_compute[192626]: 2025-10-04 06:00:20.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:21 np0005470441 nova_compute[192626]: 2025-10-04 06:00:21.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:21 np0005470441 nova_compute[192626]: 2025-10-04 06:00:21.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 02:00:22 np0005470441 nova_compute[192626]: 2025-10-04 06:00:22.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:22 np0005470441 nova_compute[192626]: 2025-10-04 06:00:22.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  4 02:00:24 np0005470441 nova_compute[192626]: 2025-10-04 06:00:24.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:25 np0005470441 nova_compute[192626]: 2025-10-04 06:00:25.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:26 np0005470441 nova_compute[192626]: 2025-10-04 06:00:26.997 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:26 np0005470441 nova_compute[192626]: 2025-10-04 06:00:26.997 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 02:00:26 np0005470441 nova_compute[192626]: 2025-10-04 06:00:26.997 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 02:00:27 np0005470441 nova_compute[192626]: 2025-10-04 06:00:27.015 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  4 02:00:27 np0005470441 nova_compute[192626]: 2025-10-04 06:00:27.016 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:27 np0005470441 podman[237746]: 2025-10-04 06:00:27.314800124 +0000 UTC m=+0.062347525 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 02:00:27 np0005470441 podman[237745]: 2025-10-04 06:00:27.31917857 +0000 UTC m=+0.067585885 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 02:00:28 np0005470441 nova_compute[192626]: 2025-10-04 06:00:28.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:28 np0005470441 nova_compute[192626]: 2025-10-04 06:00:28.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  4 02:00:28 np0005470441 nova_compute[192626]: 2025-10-04 06:00:28.748 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:29 np0005470441 podman[237789]: 2025-10-04 06:00:29.328727363 +0000 UTC m=+0.076472641 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  4 02:00:29 np0005470441 podman[237790]: 2025-10-04 06:00:29.358408587 +0000 UTC m=+0.109220303 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.vendor=CentOS)
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.748 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.780 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.781 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.782 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.782 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.933 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.935 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5750MB free_disk=73.41731643676758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.936 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:00:29 np0005470441 nova_compute[192626]: 2025-10-04 06:00:29.936 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.067 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.068 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.145 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.204 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.231 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.232 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.295s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:00:30 np0005470441 nova_compute[192626]: 2025-10-04 06:00:30.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:31 np0005470441 nova_compute[192626]: 2025-10-04 06:00:31.201 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:31 np0005470441 nova_compute[192626]: 2025-10-04 06:00:31.718 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:34 np0005470441 nova_compute[192626]: 2025-10-04 06:00:34.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:35 np0005470441 nova_compute[192626]: 2025-10-04 06:00:35.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:38 np0005470441 podman[237829]: 2025-10-04 06:00:38.307858278 +0000 UTC m=+0.055069675 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc.)
Oct  4 02:00:39 np0005470441 nova_compute[192626]: 2025-10-04 06:00:39.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:39 np0005470441 nova_compute[192626]: 2025-10-04 06:00:39.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:00:40 np0005470441 nova_compute[192626]: 2025-10-04 06:00:40.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:42 np0005470441 ovn_controller[94840]: 2025-10-04T06:00:42Z|00407|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Oct  4 02:00:43 np0005470441 podman[237850]: 2025-10-04 06:00:43.309353169 +0000 UTC m=+0.057495325 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 02:00:44 np0005470441 nova_compute[192626]: 2025-10-04 06:00:44.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:45 np0005470441 podman[237875]: 2025-10-04 06:00:45.303542521 +0000 UTC m=+0.056857167 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  4 02:00:45 np0005470441 nova_compute[192626]: 2025-10-04 06:00:45.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:49 np0005470441 nova_compute[192626]: 2025-10-04 06:00:49.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:49 np0005470441 podman[237894]: 2025-10-04 06:00:49.355023667 +0000 UTC m=+0.103636322 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  4 02:00:50 np0005470441 nova_compute[192626]: 2025-10-04 06:00:50.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:54 np0005470441 nova_compute[192626]: 2025-10-04 06:00:54.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:55 np0005470441 nova_compute[192626]: 2025-10-04 06:00:55.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:00:58 np0005470441 podman[237921]: 2025-10-04 06:00:58.293703617 +0000 UTC m=+0.046358745 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  4 02:00:58 np0005470441 podman[237920]: 2025-10-04 06:00:58.306392512 +0000 UTC m=+0.059683458 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 02:00:59 np0005470441 nova_compute[192626]: 2025-10-04 06:00:59.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:00 np0005470441 podman[237966]: 2025-10-04 06:01:00.298446882 +0000 UTC m=+0.053176130 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid)
Oct  4 02:01:00 np0005470441 podman[237967]: 2025-10-04 06:01:00.298870994 +0000 UTC m=+0.050064561 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible)
Oct  4 02:01:00 np0005470441 nova_compute[192626]: 2025-10-04 06:01:00.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.210 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.211 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.226 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.328 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.328 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.341 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.342 2 INFO nova.compute.claims [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.536 2 DEBUG nova.compute.provider_tree [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.556 2 DEBUG nova.scheduler.client.report [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.589 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.590 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.641 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.642 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.662 2 INFO nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.682 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.797 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.798 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.799 2 INFO nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Creating image(s)#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.800 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.800 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.801 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.813 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.876 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.878 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.878 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.889 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.947 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.949 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.985 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e,backing_fmt=raw /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.986 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "cd383f872a9b0b4decacf05b15cfbd33d0ac924e" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:01 np0005470441 nova_compute[192626]: 2025-10-04 06:01:01.987 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.048 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/cd383f872a9b0b4decacf05b15cfbd33d0ac924e --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.049 2 DEBUG nova.virt.disk.api [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Checking if we can resize image /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.050 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.102 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.104 2 DEBUG nova.virt.disk.api [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Cannot resize image /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.105 2 DEBUG nova.objects.instance [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'migration_context' on Instance uuid ec78ff84-cfe0-441c-b73d-20ea65af794c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.120 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.120 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Ensure instance console log exists: /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.121 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.121 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.121 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:02 np0005470441 nova_compute[192626]: 2025-10-04 06:01:02.653 2 DEBUG nova.policy [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187f315c9d1f47e18b06b24890dcb88a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3993802d0c4a44febb9b33931e51db84', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  4 02:01:04 np0005470441 nova_compute[192626]: 2025-10-04 06:01:04.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:04 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:04.387 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '6e:01:fd', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'fa:73:e1:1d:d7:2c'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 02:01:04 np0005470441 nova_compute[192626]: 2025-10-04 06:01:04.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:04 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:04.388 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  4 02:01:04 np0005470441 nova_compute[192626]: 2025-10-04 06:01:04.555 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Successfully created port: a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.493 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Successfully updated port: a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.526 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.527 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquired lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.527 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.691 2 DEBUG nova.compute.manager [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.692 2 DEBUG nova.compute.manager [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing instance network info cache due to event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.693 2 DEBUG oslo_concurrency.lockutils [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 02:01:05 np0005470441 nova_compute[192626]: 2025-10-04 06:01:05.726 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  4 02:01:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:06.770 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:06.770 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:06.770 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.164 2 DEBUG nova.network.neutron [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.189 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Releasing lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.190 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Instance network_info: |[{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.191 2 DEBUG oslo_concurrency.lockutils [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.191 2 DEBUG nova.network.neutron [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.196 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Start _get_guest_xml network_info=[{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'encrypted': False, 'boot_index': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'image_id': '2b7414ad-3419-4b92-8471-b72003f69821'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.201 2 WARNING nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.207 2 DEBUG nova.virt.libvirt.host [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.208 2 DEBUG nova.virt.libvirt.host [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.212 2 DEBUG nova.virt.libvirt.host [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.213 2 DEBUG nova.virt.libvirt.host [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.215 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.215 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-04T05:30:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='9585bc8c-c7a8-4928-b67c-bb6035012f8e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-04T05:30:24Z,direct_url=<?>,disk_format='qcow2',id=2b7414ad-3419-4b92-8471-b72003f69821,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='9c10f41186584238aac60c35bb94f39a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-04T05:30:26Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.216 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.217 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.217 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.218 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.218 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.218 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.219 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.219 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.220 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.220 2 DEBUG nova.virt.hardware [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.226 2 DEBUG nova.virt.libvirt.vif [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T06:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1990852950',display_name='tempest-TestGettingAddress-server-1990852950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1990852950',id=57,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+DK5p2bnGFmsF5GKxXcZqWL9Pj6W21kfEqDXg1LoVn7LyL1ZMXQjuCXtTYPPtxhrcq06uQjX6tgajiL8yhu2jJVjzD/PR0SOPz17wc6KsdDXmzTlAXOYftdb04hKgNzg==',key_name='tempest-TestGettingAddress-802925117',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-p4mky9za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T06:01:01Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=ec78ff84-cfe0-441c-b73d-20ea65af794c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.227 2 DEBUG nova.network.os_vif_util [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.228 2 DEBUG nova.network.os_vif_util [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.230 2 DEBUG nova.objects.instance [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'pci_devices' on Instance uuid ec78ff84-cfe0-441c-b73d-20ea65af794c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.258 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] End _get_guest_xml xml=<domain type="kvm">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <uuid>ec78ff84-cfe0-441c-b73d-20ea65af794c</uuid>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <name>instance-00000039</name>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <memory>131072</memory>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <vcpu>1</vcpu>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <metadata>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:name>tempest-TestGettingAddress-server-1990852950</nova:name>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:creationTime>2025-10-04 06:01:07</nova:creationTime>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:flavor name="m1.nano">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:memory>128</nova:memory>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:disk>1</nova:disk>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:swap>0</nova:swap>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:ephemeral>0</nova:ephemeral>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:vcpus>1</nova:vcpus>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      </nova:flavor>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:owner>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:user uuid="187f315c9d1f47e18b06b24890dcb88a">tempest-TestGettingAddress-1483786899-project-member</nova:user>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:project uuid="3993802d0c4a44febb9b33931e51db84">tempest-TestGettingAddress-1483786899</nova:project>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      </nova:owner>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:root type="image" uuid="2b7414ad-3419-4b92-8471-b72003f69821"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <nova:ports>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        <nova:port uuid="a0f1c105-a2ee-4be2-a78c-99e06ecaeec6">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe50:2e80" ipVersion="6"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:        </nova:port>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      </nova:ports>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </nova:instance>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </metadata>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <sysinfo type="smbios">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <system>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="manufacturer">RDO</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="product">OpenStack Compute</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="serial">ec78ff84-cfe0-441c-b73d-20ea65af794c</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="uuid">ec78ff84-cfe0-441c-b73d-20ea65af794c</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <entry name="family">Virtual Machine</entry>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </system>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </sysinfo>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <os>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <boot dev="hd"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <smbios mode="sysinfo"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </os>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <features>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <acpi/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <apic/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <vmcoreinfo/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </features>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <clock offset="utc">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <timer name="pit" tickpolicy="delay"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <timer name="hpet" present="no"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </clock>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <cpu mode="custom" match="exact">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <model>Nehalem</model>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <topology sockets="1" cores="1" threads="1"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </cpu>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  <devices>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <disk type="file" device="disk">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <target dev="vda" bus="virtio"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </disk>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <disk type="file" device="cdrom">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <driver name="qemu" type="raw" cache="none"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <source file="/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.config"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <target dev="sda" bus="sata"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </disk>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <interface type="ethernet">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <mac address="fa:16:3e:50:2e:80"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <driver name="vhost" rx_queue_size="512"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <mtu size="1442"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <target dev="tapa0f1c105-a2"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </interface>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <serial type="pty">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <log file="/var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/console.log" append="off"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </serial>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <video>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <model type="virtio"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </video>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <input type="tablet" bus="usb"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <rng model="virtio">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <backend model="random">/dev/urandom</backend>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </rng>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="pci" model="pcie-root-port"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <controller type="usb" index="0"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    <memballoon model="virtio">
Oct  4 02:01:07 np0005470441 nova_compute[192626]:      <stats period="10"/>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:    </memballoon>
Oct  4 02:01:07 np0005470441 nova_compute[192626]:  </devices>
Oct  4 02:01:07 np0005470441 nova_compute[192626]: </domain>
Oct  4 02:01:07 np0005470441 nova_compute[192626]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.261 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Preparing to wait for external event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.261 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.262 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.262 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.263 2 DEBUG nova.virt.libvirt.vif [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-04T06:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1990852950',display_name='tempest-TestGettingAddress-server-1990852950',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1990852950',id=57,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+DK5p2bnGFmsF5GKxXcZqWL9Pj6W21kfEqDXg1LoVn7LyL1ZMXQjuCXtTYPPtxhrcq06uQjX6tgajiL8yhu2jJVjzD/PR0SOPz17wc6KsdDXmzTlAXOYftdb04hKgNzg==',key_name='tempest-TestGettingAddress-802925117',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-p4mky9za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-04T06:01:01Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=ec78ff84-cfe0-441c-b73d-20ea65af794c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.264 2 DEBUG nova.network.os_vif_util [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.265 2 DEBUG nova.network.os_vif_util [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.266 2 DEBUG os_vif [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.267 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa0f1c105-a2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.272 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa0f1c105-a2, col_values=(('external_ids', {'iface-id': 'a0f1c105-a2ee-4be2-a78c-99e06ecaeec6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:2e:80', 'vm-uuid': 'ec78ff84-cfe0-441c-b73d-20ea65af794c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:07 np0005470441 NetworkManager[51690]: <info>  [1759557667.2756] manager: (tapa0f1c105-a2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.286 2 INFO os_vif [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2')#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.366 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.367 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.367 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] No VIF found with MAC fa:16:3e:50:2e:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.368 2 INFO nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Using config drive#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.772 2 INFO nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Creating config drive at /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.config#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.781 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjfiyf2cr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:07 np0005470441 nova_compute[192626]: 2025-10-04 06:01:07.909 2 DEBUG oslo_concurrency.processutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjfiyf2cr" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:08 np0005470441 kernel: tapa0f1c105-a2: entered promiscuous mode
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.0187] manager: (tapa0f1c105-a2): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:08Z|00408|binding|INFO|Claiming lport a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 for this chassis.
Oct  4 02:01:08 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:08Z|00409|binding|INFO|a0f1c105-a2ee-4be2-a78c-99e06ecaeec6: Claiming fa:16:3e:50:2e:80 10.100.0.14 2001:db8::f816:3eff:fe50:2e80
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.0461] manager: (patch-br-int-to-provnet-215f1097-4107-4795-be3c-03822bb23ae3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/204)
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.0480] manager: (patch-provnet-215f1097-4107-4795-be3c-03822bb23ae3-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.051 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:2e:80 10.100.0.14 2001:db8::f816:3eff:fe50:2e80'], port_security=['fa:16:3e:50:2e:80 10.100.0.14 2001:db8::f816:3eff:fe50:2e80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fe50:2e80/64', 'neutron:device_id': 'ec78ff84-cfe0-441c-b73d-20ea65af794c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dfd2fab1-bdb0-4076-b40a-db33bfd4994b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08a3064d-ad51-4659-9426-9641fbf843fd, chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.053 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 in datapath 7cf19c94-4304-4371-b395-c2514f30f6bc bound to our chassis#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.054 103689 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7cf19c94-4304-4371-b395-c2514f30f6bc#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.070 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d40f64d3-f3a8-4935-8d97-420f15723c22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.071 103689 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7cf19c94-41 in ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  4 02:01:08 np0005470441 systemd-machined[152624]: New machine qemu-31-instance-00000039.
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.075 220349 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7cf19c94-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.075 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[fb7525ba-f200-4446-8811-16e73c969b1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.077 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d915c812-dca7-444b-a099-f06f53dfcd15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 systemd-udevd[238052]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.096 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4bee93-2d89-44c3-b794-eb4db80573ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.1045] device (tapa0f1c105-a2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.1056] device (tapa0f1c105-a2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.127 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[aa0266d1-b83b-4173-9c1d-cb4f4acfb07b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 systemd[1]: Started Virtual Machine qemu-31-instance-00000039.
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:08Z|00410|binding|INFO|Setting lport a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 ovn-installed in OVS
Oct  4 02:01:08 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:08Z|00411|binding|INFO|Setting lport a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 up in Southbound
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.169 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[7aef597c-c217-4499-9cfb-f639bf02f7c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.175 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea6ad69-d65c-47b2-b738-10bdeaffb274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.1763] manager: (tap7cf19c94-40): new Veth device (/org/freedesktop/NetworkManager/Devices/206)
Oct  4 02:01:08 np0005470441 systemd-udevd[238056]: Network interface NamePolicy= disabled on kernel command line.
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.213 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[f455aa16-bac8-4f7d-a959-8c70d3459138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.217 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[c90c16c2-7978-48bd-8df5-63cab4c7ddd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.2430] device (tap7cf19c94-40): carrier: link connected
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.251 220363 DEBUG oslo.privsep.daemon [-] privsep: reply[852aec45-1052-4a8f-a166-d47788566cef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.277 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[450b3bbb-f17d-4be2-9718-0f39fa073416]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7cf19c94-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:4c:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556589, 'reachable_time': 25411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238084, 'error': None, 'target': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.301 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[27a20f53-8a39-44e0-ad59-ad8c6941d07a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:4cb9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556589, 'tstamp': 556589}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 238085, 'error': None, 'target': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.325 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[ac605a0f-1b39-48bf-bf8b-76ad53a1ffa7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7cf19c94-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:4c:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556589, 'reachable_time': 25411, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 238086, 'error': None, 'target': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.366 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e23b7bb6-feab-4d09-9413-bdf027800b2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.440 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[c9ddcd7b-999f-4025-b271-53c5d49c30eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.441 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cf19c94-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.442 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.442 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7cf19c94-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 kernel: tap7cf19c94-40: entered promiscuous mode
Oct  4 02:01:08 np0005470441 NetworkManager[51690]: <info>  [1759557668.4454] manager: (tap7cf19c94-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/207)
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.448 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7cf19c94-40, col_values=(('external_ids', {'iface-id': '191b7d1e-368e-4ed6-9fd7-fb99ed92e607'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:08 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:08Z|00412|binding|INFO|Releasing lport 191b7d1e-368e-4ed6-9fd7-fb99ed92e607 from this chassis (sb_readonly=0)
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.476 103689 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7cf19c94-4304-4371-b395-c2514f30f6bc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7cf19c94-4304-4371-b395-c2514f30f6bc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.477 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e00a47fa-0843-4075-b126-1a907ecf7a62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.478 103689 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: global
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    log         /dev/log local0 debug
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    log-tag     haproxy-metadata-proxy-7cf19c94-4304-4371-b395-c2514f30f6bc
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    user        root
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    group       root
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    maxconn     1024
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    pidfile     /var/lib/neutron/external/pids/7cf19c94-4304-4371-b395-c2514f30f6bc.pid.haproxy
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    daemon
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: defaults
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    log global
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    mode http
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    option httplog
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    option dontlognull
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    option http-server-close
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    option forwardfor
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    retries                 3
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    timeout http-request    30s
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    timeout connect         30s
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    timeout client          32s
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    timeout server          32s
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    timeout http-keep-alive 30s
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: listen listener
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    bind 169.254.169.254:80
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    server metadata /var/lib/neutron/metadata_proxy
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]:    http-request add-header X-OVN-Network-ID 7cf19c94-4304-4371-b395-c2514f30f6bc
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  4 02:01:08 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:08.478 103689 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'env', 'PROCESS_TAG=haproxy-7cf19c94-4304-4371-b395-c2514f30f6bc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7cf19c94-4304-4371-b395-c2514f30f6bc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.532 2 DEBUG nova.compute.manager [req-57247eaa-43d3-4aaa-87ad-653bf2becb83 req-04d23784-0d97-4f0a-8c80-acfb092c8efc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.535 2 DEBUG oslo_concurrency.lockutils [req-57247eaa-43d3-4aaa-87ad-653bf2becb83 req-04d23784-0d97-4f0a-8c80-acfb092c8efc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.535 2 DEBUG oslo_concurrency.lockutils [req-57247eaa-43d3-4aaa-87ad-653bf2becb83 req-04d23784-0d97-4f0a-8c80-acfb092c8efc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.536 2 DEBUG oslo_concurrency.lockutils [req-57247eaa-43d3-4aaa-87ad-653bf2becb83 req-04d23784-0d97-4f0a-8c80-acfb092c8efc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:08 np0005470441 nova_compute[192626]: 2025-10-04 06:01:08.536 2 DEBUG nova.compute.manager [req-57247eaa-43d3-4aaa-87ad-653bf2becb83 req-04d23784-0d97-4f0a-8c80-acfb092c8efc 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Processing event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  4 02:01:08 np0005470441 podman[238126]: 2025-10-04 06:01:08.985566788 +0000 UTC m=+0.106106004 container create 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  4 02:01:08 np0005470441 podman[238126]: 2025-10-04 06:01:08.902280942 +0000 UTC m=+0.022820168 image pull bbbfce09fcb93ace1ccb7e517f362406dd2731dee6975690bde9d5d28392a1b7 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.007 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557669.0070555, ec78ff84-cfe0-441c-b73d-20ea65af794c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.008 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] VM Started (Lifecycle Event)#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.011 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.014 2 DEBUG nova.network.neutron [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updated VIF entry in instance network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.015 2 DEBUG nova.network.neutron [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.016 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.021 2 INFO nova.virt.libvirt.driver [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Instance spawned successfully.#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.021 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.048 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.049 2 DEBUG oslo_concurrency.lockutils [req-368d9030-2701-4b25-8fa7-ec1f06c81488 req-7b02e1b7-027b-446e-bfed-ec624981d8fa 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 02:01:09 np0005470441 systemd[1]: Started libpod-conmon-7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16.scope.
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.055 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.060 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.060 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.061 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.062 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.063 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.064 2 DEBUG nova.virt.libvirt.driver [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.076 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.076 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557669.0106354, ec78ff84-cfe0-441c-b73d-20ea65af794c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.077 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] VM Paused (Lifecycle Event)#033[00m
Oct  4 02:01:09 np0005470441 systemd[1]: Started libcrun container.
Oct  4 02:01:09 np0005470441 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36e8f324076c7a5d539f47d44c908a474c55275409109214f1308d05dc59bcb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.101 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.104 2 DEBUG nova.virt.driver [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] Emitting event <LifecycleEvent: 1759557669.0149994, ec78ff84-cfe0-441c-b73d-20ea65af794c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.104 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] VM Resumed (Lifecycle Event)#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.129 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.132 2 DEBUG nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.135 2 INFO nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Took 7.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.136 2 DEBUG nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  4 02:01:09 np0005470441 podman[238126]: 2025-10-04 06:01:09.139883146 +0000 UTC m=+0.260422402 container init 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 02:01:09 np0005470441 podman[238139]: 2025-10-04 06:01:09.142994046 +0000 UTC m=+0.123668448 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Oct  4 02:01:09 np0005470441 podman[238126]: 2025-10-04 06:01:09.146088575 +0000 UTC m=+0.266627781 container start 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 02:01:09 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [NOTICE]   (238166) : New worker (238168) forked
Oct  4 02:01:09 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [NOTICE]   (238166) : Loading success.
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.170 2 INFO nova.compute.manager [None req-440d1ecc-a397-4e45-8c51-ee5cac92f470 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.218 2 INFO nova.compute.manager [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Took 7.93 seconds to build instance.#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.240 2 DEBUG oslo_concurrency.lockutils [None req-533d198a-36f4-4a3f-a799-c59e21da74b3 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:09 np0005470441 nova_compute[192626]: 2025-10-04 06:01:09.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:10 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:10.391 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=9c4f1832-26e9-4f83-989c-c9b104eab4b1, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.658 2 DEBUG nova.compute.manager [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.658 2 DEBUG oslo_concurrency.lockutils [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.659 2 DEBUG oslo_concurrency.lockutils [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.659 2 DEBUG oslo_concurrency.lockutils [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.659 2 DEBUG nova.compute.manager [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] No waiting events found dispatching network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 02:01:10 np0005470441 nova_compute[192626]: 2025-10-04 06:01:10.660 2 WARNING nova.compute.manager [req-eaa693f4-2bbc-4e09-a8e8-bd80b2bfe1a2 req-3276aacb-7512-44d3-8bb2-d76cfc9eda6f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received unexpected event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 for instance with vm_state active and task_state None.#033[00m
Oct  4 02:01:12 np0005470441 nova_compute[192626]: 2025-10-04 06:01:12.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:13 np0005470441 nova_compute[192626]: 2025-10-04 06:01:13.743 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:14 np0005470441 podman[238177]: 2025-10-04 06:01:14.297233892 +0000 UTC m=+0.051748249 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.676 2 DEBUG nova.compute.manager [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.676 2 DEBUG nova.compute.manager [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing instance network info cache due to event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.676 2 DEBUG oslo_concurrency.lockutils [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.676 2 DEBUG oslo_concurrency.lockutils [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 02:01:14 np0005470441 nova_compute[192626]: 2025-10-04 06:01:14.677 2 DEBUG nova.network.neutron [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 02:01:15 np0005470441 nova_compute[192626]: 2025-10-04 06:01:15.711 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:16 np0005470441 podman[238201]: 2025-10-04 06:01:16.312009975 +0000 UTC m=+0.059894364 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 02:01:17 np0005470441 nova_compute[192626]: 2025-10-04 06:01:17.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:17 np0005470441 nova_compute[192626]: 2025-10-04 06:01:17.623 2 DEBUG nova.network.neutron [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updated VIF entry in instance network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 02:01:17 np0005470441 nova_compute[192626]: 2025-10-04 06:01:17.624 2 DEBUG nova.network.neutron [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:17 np0005470441 nova_compute[192626]: 2025-10-04 06:01:17.650 2 DEBUG oslo_concurrency.lockutils [req-3df92dfa-8e93-4e99-952e-ad5ad0214b8e req-7250090e-7947-47b3-9962-7d2fbf1079a4 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 02:01:19 np0005470441 nova_compute[192626]: 2025-10-04 06:01:19.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:19 np0005470441 nova_compute[192626]: 2025-10-04 06:01:19.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:20 np0005470441 podman[238234]: 2025-10-04 06:01:20.334374274 +0000 UTC m=+0.081081103 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  4 02:01:22 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:22Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:50:2e:80 10.100.0.14
Oct  4 02:01:22 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:22Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:50:2e:80 10.100.0.14
Oct  4 02:01:22 np0005470441 nova_compute[192626]: 2025-10-04 06:01:22.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:23 np0005470441 nova_compute[192626]: 2025-10-04 06:01:23.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:23 np0005470441 nova_compute[192626]: 2025-10-04 06:01:23.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  4 02:01:24 np0005470441 nova_compute[192626]: 2025-10-04 06:01:24.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:27 np0005470441 nova_compute[192626]: 2025-10-04 06:01:27.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:27 np0005470441 nova_compute[192626]: 2025-10-04 06:01:27.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:28 np0005470441 nova_compute[192626]: 2025-10-04 06:01:28.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:28 np0005470441 nova_compute[192626]: 2025-10-04 06:01:28.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  4 02:01:28 np0005470441 nova_compute[192626]: 2025-10-04 06:01:28.717 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  4 02:01:29 np0005470441 nova_compute[192626]: 2025-10-04 06:01:29.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:29 np0005470441 podman[238261]: 2025-10-04 06:01:29.347794612 +0000 UTC m=+0.091366529 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  4 02:01:29 np0005470441 podman[238262]: 2025-10-04 06:01:29.349600254 +0000 UTC m=+0.082479603 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 02:01:29 np0005470441 nova_compute[192626]: 2025-10-04 06:01:29.594 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 02:01:29 np0005470441 nova_compute[192626]: 2025-10-04 06:01:29.594 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquired lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 02:01:29 np0005470441 nova_compute[192626]: 2025-10-04 06:01:29.594 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  4 02:01:29 np0005470441 nova_compute[192626]: 2025-10-04 06:01:29.594 2 DEBUG nova.objects.instance [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ec78ff84-cfe0-441c-b73d-20ea65af794c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 02:01:31 np0005470441 podman[238304]: 2025-10-04 06:01:31.323388829 +0000 UTC m=+0.074036921 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid)
Oct  4 02:01:31 np0005470441 podman[238305]: 2025-10-04 06:01:31.346754731 +0000 UTC m=+0.092138361 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.624 2 DEBUG nova.network.neutron [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.642 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Releasing lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.642 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.642 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.665 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.665 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.666 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.666 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.731 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.828 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.829 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  4 02:01:31 np0005470441 nova_compute[192626]: 2025-10-04 06:01:31.884 2 DEBUG oslo_concurrency.processutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.043 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.044 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5580MB free_disk=73.38847732543945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.044 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.044 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.103 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Instance ec78ff84-cfe0-441c-b73d-20ea65af794c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.103 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.104 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.150 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.165 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.186 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.186 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.260 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.292 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.292 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.724 2 DEBUG nova.compute.manager [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.724 2 DEBUG nova.compute.manager [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing instance network info cache due to event network-changed-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.725 2 DEBUG oslo_concurrency.lockutils [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.725 2 DEBUG oslo_concurrency.lockutils [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquired lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.726 2 DEBUG nova.network.neutron [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Refreshing network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.842 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.843 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.843 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.844 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.844 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.846 2 INFO nova.compute.manager [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Terminating instance#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.848 2 DEBUG nova.compute.manager [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  4 02:01:32 np0005470441 kernel: tapa0f1c105-a2 (unregistering): left promiscuous mode
Oct  4 02:01:32 np0005470441 NetworkManager[51690]: <info>  [1759557692.8725] device (tapa0f1c105-a2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  4 02:01:32 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:32Z|00413|binding|INFO|Releasing lport a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 from this chassis (sb_readonly=0)
Oct  4 02:01:32 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:32Z|00414|binding|INFO|Setting lport a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 down in Southbound
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.884 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:32 np0005470441 ovn_controller[94840]: 2025-10-04T06:01:32Z|00415|binding|INFO|Removing iface tapa0f1c105-a2 ovn-installed in OVS
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.887 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:32.894 103689 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:2e:80 10.100.0.14 2001:db8::f816:3eff:fe50:2e80'], port_security=['fa:16:3e:50:2e:80 10.100.0.14 2001:db8::f816:3eff:fe50:2e80'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28 2001:db8::f816:3eff:fe50:2e80/64', 'neutron:device_id': 'ec78ff84-cfe0-441c-b73d-20ea65af794c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7cf19c94-4304-4371-b395-c2514f30f6bc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3993802d0c4a44febb9b33931e51db84', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dfd2fab1-bdb0-4076-b40a-db33bfd4994b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08a3064d-ad51-4659-9426-9641fbf843fd, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>], logical_port=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f5e01019640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  4 02:01:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:32.897 103689 INFO neutron.agent.ovn.metadata.agent [-] Port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 in datapath 7cf19c94-4304-4371-b395-c2514f30f6bc unbound from our chassis#033[00m
Oct  4 02:01:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:32.899 103689 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7cf19c94-4304-4371-b395-c2514f30f6bc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  4 02:01:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:32.901 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d9033f15-e267-4223-986f-e5cb6c64d112]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:32 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:32.902 103689 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc namespace which is not needed anymore#033[00m
Oct  4 02:01:32 np0005470441 nova_compute[192626]: 2025-10-04 06:01:32.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:32 np0005470441 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000039.scope: Deactivated successfully.
Oct  4 02:01:32 np0005470441 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d00000039.scope: Consumed 13.440s CPU time.
Oct  4 02:01:32 np0005470441 systemd-machined[152624]: Machine qemu-31-instance-00000039 terminated.
Oct  4 02:01:33 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [NOTICE]   (238166) : haproxy version is 2.8.14-c23fe91
Oct  4 02:01:33 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [NOTICE]   (238166) : path to executable is /usr/sbin/haproxy
Oct  4 02:01:33 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [WARNING]  (238166) : Exiting Master process...
Oct  4 02:01:33 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [ALERT]    (238166) : Current worker (238168) exited with code 143 (Terminated)
Oct  4 02:01:33 np0005470441 neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc[238158]: [WARNING]  (238166) : All workers exited. Exiting... (0)
Oct  4 02:01:33 np0005470441 systemd[1]: libpod-7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16.scope: Deactivated successfully.
Oct  4 02:01:33 np0005470441 podman[238377]: 2025-10-04 06:01:33.053145663 +0000 UTC m=+0.048777594 container died 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  4 02:01:33 np0005470441 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16-userdata-shm.mount: Deactivated successfully.
Oct  4 02:01:33 np0005470441 systemd[1]: var-lib-containers-storage-overlay-36e8f324076c7a5d539f47d44c908a474c55275409109214f1308d05dc59bcb7-merged.mount: Deactivated successfully.
Oct  4 02:01:33 np0005470441 podman[238377]: 2025-10-04 06:01:33.099844326 +0000 UTC m=+0.095476237 container cleanup 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, org.label-schema.license=GPLv2)
Oct  4 02:01:33 np0005470441 systemd[1]: libpod-conmon-7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16.scope: Deactivated successfully.
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.105 2 INFO nova.virt.libvirt.driver [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Instance destroyed successfully.#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.106 2 DEBUG nova.objects.instance [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lazy-loading 'resources' on Instance uuid ec78ff84-cfe0-441c-b73d-20ea65af794c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.120 2 DEBUG nova.virt.libvirt.vif [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-04T06:01:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1990852950',display_name='tempest-TestGettingAddress-server-1990852950',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1990852950',id=57,image_ref='2b7414ad-3419-4b92-8471-b72003f69821',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBM+DK5p2bnGFmsF5GKxXcZqWL9Pj6W21kfEqDXg1LoVn7LyL1ZMXQjuCXtTYPPtxhrcq06uQjX6tgajiL8yhu2jJVjzD/PR0SOPz17wc6KsdDXmzTlAXOYftdb04hKgNzg==',key_name='tempest-TestGettingAddress-802925117',keypairs=<?>,launch_index=0,launched_at=2025-10-04T06:01:09Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3993802d0c4a44febb9b33931e51db84',ramdisk_id='',reservation_id='r-p4mky9za',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='2b7414ad-3419-4b92-8471-b72003f69821',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-1483786899',owner_user_name='tempest-TestGettingAddress-1483786899-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-04T06:01:09Z,user_data=None,user_id='187f315c9d1f47e18b06b24890dcb88a',uuid=ec78ff84-cfe0-441c-b73d-20ea65af794c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.121 2 DEBUG nova.network.os_vif_util [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converting VIF {"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.121 2 DEBUG nova.network.os_vif_util [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.122 2 DEBUG os_vif [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.124 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa0f1c105-a2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.129 2 INFO os_vif [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:2e:80,bridge_name='br-int',has_traffic_filtering=True,id=a0f1c105-a2ee-4be2-a78c-99e06ecaeec6,network=Network(7cf19c94-4304-4371-b395-c2514f30f6bc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa0f1c105-a2')#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.129 2 INFO nova.virt.libvirt.driver [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Deleting instance files /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c_del#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.130 2 INFO nova.virt.libvirt.driver [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Deletion of /var/lib/nova/instances/ec78ff84-cfe0-441c-b73d-20ea65af794c_del complete#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.170 2 INFO nova.compute.manager [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.171 2 DEBUG oslo.service.loopingcall [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.171 2 DEBUG nova.compute.manager [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.171 2 DEBUG nova.network.neutron [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  4 02:01:33 np0005470441 podman[238421]: 2025-10-04 06:01:33.176548623 +0000 UTC m=+0.048542288 container remove 7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.182 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[3a9c6fb8-18bc-4ee7-8797-777a9a050e98]: (4, ('Sat Oct  4 06:01:32 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc (7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16)\n7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16\nSat Oct  4 06:01:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc (7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16)\n7a7535b1c4c555aa24c01d0668f4db12260793ebe1b2a421f6001987a5177e16\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.184 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[b651ae9c-68b3-4855-9200-d828fe428450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.185 103689 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7cf19c94-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:33 np0005470441 kernel: tap7cf19c94-40: left promiscuous mode
Oct  4 02:01:33 np0005470441 nova_compute[192626]: 2025-10-04 06:01:33.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.205 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[1f1ddda0-92eb-4943-b47d-ae561132302c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.235 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[16ce718a-988c-40f5-867d-4e1cb0defd25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.237 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f13fd8-a84b-421e-8758-748e594578a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.252 220349 DEBUG oslo.privsep.daemon [-] privsep: reply[d2939798-f49a-484a-ae68-87f5baf6e881]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556582, 'reachable_time': 18002, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238436, 'error': None, 'target': 'ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.255 103801 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7cf19c94-4304-4371-b395-c2514f30f6bc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  4 02:01:33 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:01:33.255 103801 DEBUG oslo.privsep.daemon [-] privsep: reply[0a5885ac-399f-4876-a7ff-d1445c45015c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  4 02:01:33 np0005470441 systemd[1]: run-netns-ovnmeta\x2d7cf19c94\x2d4304\x2d4371\x2db395\x2dc2514f30f6bc.mount: Deactivated successfully.
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.625 2 DEBUG nova.network.neutron [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.651 2 INFO nova.compute.manager [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Took 1.48 seconds to deallocate network for instance.#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.763 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.763 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.811 2 DEBUG nova.compute.provider_tree [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.831 2 DEBUG nova.scheduler.client.report [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.854 2 DEBUG nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-vif-unplugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.854 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.855 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.855 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.856 2 DEBUG nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] No waiting events found dispatching network-vif-unplugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.856 2 WARNING nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received unexpected event network-vif-unplugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.857 2 DEBUG nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.857 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Acquiring lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.857 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.858 2 DEBUG oslo_concurrency.lockutils [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.858 2 DEBUG nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] No waiting events found dispatching network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.859 2 WARNING nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received unexpected event network-vif-plugged-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 for instance with vm_state deleted and task_state None.#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.859 2 DEBUG nova.compute.manager [req-46e7da39-9bff-431b-ac51-e3791f3e9331 req-70b8543e-dc6a-498c-a43e-52ced48ea93f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Received event network-vif-deleted-a0f1c105-a2ee-4be2-a78c-99e06ecaeec6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.861 2 DEBUG nova.network.neutron [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updated VIF entry in instance network info cache for port a0f1c105-a2ee-4be2-a78c-99e06ecaeec6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.861 2 DEBUG nova.network.neutron [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Updating instance_info_cache with network_info: [{"id": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "address": "fa:16:3e:50:2e:80", "network": {"id": "7cf19c94-4304-4371-b395-c2514f30f6bc", "bridge": "br-int", "label": "tempest-network-smoke--603503766", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe50:2e80", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3993802d0c4a44febb9b33931e51db84", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa0f1c105-a2", "ovs_interfaceid": "a0f1c105-a2ee-4be2-a78c-99e06ecaeec6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.866 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.883 2 DEBUG oslo_concurrency.lockutils [req-dcc2e529-6933-42e0-b840-77b08365937f req-245b030b-bdec-4b7b-82bc-7b249cd87c3f 69b0393e1aed49858ad1b15f7e3fdb47 312dbf8125524929a70108268b6d8e72 - - default default] Releasing lock "refresh_cache-ec78ff84-cfe0-441c-b73d-20ea65af794c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.900 2 INFO nova.scheduler.client.report [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Deleted allocations for instance ec78ff84-cfe0-441c-b73d-20ea65af794c#033[00m
Oct  4 02:01:34 np0005470441 nova_compute[192626]: 2025-10-04 06:01:34.969 2 DEBUG oslo_concurrency.lockutils [None req-c71a14db-0741-43b5-b94d-e2e964c60c96 187f315c9d1f47e18b06b24890dcb88a 3993802d0c4a44febb9b33931e51db84 - - default default] Lock "ec78ff84-cfe0-441c-b73d-20ea65af794c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:01:38 np0005470441 nova_compute[192626]: 2025-10-04 06:01:38.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:39 np0005470441 nova_compute[192626]: 2025-10-04 06:01:39.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:39 np0005470441 podman[238441]: 2025-10-04 06:01:39.375758247 +0000 UTC m=+0.113968869 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, vcs-type=git, maintainer=Red Hat, Inc.)
Oct  4 02:01:41 np0005470441 nova_compute[192626]: 2025-10-04 06:01:41.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:41 np0005470441 nova_compute[192626]: 2025-10-04 06:01:41.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:43 np0005470441 nova_compute[192626]: 2025-10-04 06:01:43.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:44 np0005470441 nova_compute[192626]: 2025-10-04 06:01:44.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:01:45 np0005470441 podman[238465]: 2025-10-04 06:01:45.333982728 +0000 UTC m=+0.088538037 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  4 02:01:47 np0005470441 podman[238489]: 2025-10-04 06:01:47.335094568 +0000 UTC m=+0.079642482 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  4 02:01:48 np0005470441 nova_compute[192626]: 2025-10-04 06:01:48.105 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759557693.1043637, ec78ff84-cfe0-441c-b73d-20ea65af794c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  4 02:01:48 np0005470441 nova_compute[192626]: 2025-10-04 06:01:48.105 2 INFO nova.compute.manager [-] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] VM Stopped (Lifecycle Event)
Oct  4 02:01:48 np0005470441 nova_compute[192626]: 2025-10-04 06:01:48.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:01:48 np0005470441 nova_compute[192626]: 2025-10-04 06:01:48.152 2 DEBUG nova.compute.manager [None req-73bef96d-02d1-493d-8ffb-c0dd64c0a263 - - - - - -] [instance: ec78ff84-cfe0-441c-b73d-20ea65af794c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  4 02:01:49 np0005470441 nova_compute[192626]: 2025-10-04 06:01:49.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:01:51 np0005470441 podman[238511]: 2025-10-04 06:01:51.334818486 +0000 UTC m=+0.084554723 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac)
Oct  4 02:01:53 np0005470441 nova_compute[192626]: 2025-10-04 06:01:53.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:01:54 np0005470441 nova_compute[192626]: 2025-10-04 06:01:54.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:01:58 np0005470441 nova_compute[192626]: 2025-10-04 06:01:58.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:01:59 np0005470441 nova_compute[192626]: 2025-10-04 06:01:59.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:00 np0005470441 podman[238541]: 2025-10-04 06:02:00.291320311 +0000 UTC m=+0.046417907 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  4 02:02:00 np0005470441 podman[238540]: 2025-10-04 06:02:00.303442729 +0000 UTC m=+0.061839160 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  4 02:02:01 np0005470441 podman[238580]: 2025-10-04 06:02:01.841446487 +0000 UTC m=+0.057815264 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  4 02:02:01 np0005470441 podman[238581]: 2025-10-04 06:02:01.876477785 +0000 UTC m=+0.078192800 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.714 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.715 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:02 np0005470441 ceilometer_agent_compute[203402]: 2025-10-04 06:02:02.716 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  4 02:02:03 np0005470441 nova_compute[192626]: 2025-10-04 06:02:03.146 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:04 np0005470441 nova_compute[192626]: 2025-10-04 06:02:04.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:02:06.771 103689 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 02:02:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:02:06.772 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 02:02:06 np0005470441 ovn_metadata_agent[103684]: 2025-10-04 06:02:06.772 103689 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  4 02:02:08 np0005470441 nova_compute[192626]: 2025-10-04 06:02:08.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:09 np0005470441 nova_compute[192626]: 2025-10-04 06:02:09.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:09 np0005470441 podman[238618]: 2025-10-04 06:02:09.679596405 +0000 UTC m=+0.071662332 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6)
Oct  4 02:02:12 np0005470441 ovn_controller[94840]: 2025-10-04T06:02:12Z|00416|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  4 02:02:13 np0005470441 nova_compute[192626]: 2025-10-04 06:02:13.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:14 np0005470441 nova_compute[192626]: 2025-10-04 06:02:14.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:14 np0005470441 nova_compute[192626]: 2025-10-04 06:02:14.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:15 np0005470441 nova_compute[192626]: 2025-10-04 06:02:15.712 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:16 np0005470441 podman[238639]: 2025-10-04 06:02:16.354382217 +0000 UTC m=+0.100838561 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  4 02:02:18 np0005470441 nova_compute[192626]: 2025-10-04 06:02:18.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:18 np0005470441 podman[238663]: 2025-10-04 06:02:18.320290486 +0000 UTC m=+0.066266763 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  4 02:02:19 np0005470441 nova_compute[192626]: 2025-10-04 06:02:19.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:20 np0005470441 nova_compute[192626]: 2025-10-04 06:02:20.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:22 np0005470441 podman[238686]: 2025-10-04 06:02:22.327396509 +0000 UTC m=+0.078362896 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  4 02:02:23 np0005470441 nova_compute[192626]: 2025-10-04 06:02:23.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:24 np0005470441 nova_compute[192626]: 2025-10-04 06:02:24.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:24 np0005470441 nova_compute[192626]: 2025-10-04 06:02:24.715 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:24 np0005470441 nova_compute[192626]: 2025-10-04 06:02:24.716 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  4 02:02:27 np0005470441 nova_compute[192626]: 2025-10-04 06:02:27.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:28 np0005470441 nova_compute[192626]: 2025-10-04 06:02:28.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:29 np0005470441 nova_compute[192626]: 2025-10-04 06:02:29.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  4 02:02:29 np0005470441 nova_compute[192626]: 2025-10-04 06:02:29.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:29 np0005470441 nova_compute[192626]: 2025-10-04 06:02:29.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  4 02:02:29 np0005470441 nova_compute[192626]: 2025-10-04 06:02:29.718 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  4 02:02:29 np0005470441 nova_compute[192626]: 2025-10-04 06:02:29.740 2 DEBUG nova.compute.manager [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.716 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.749 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.750 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.751 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.751 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  4 02:02:30 np0005470441 podman[238715]: 2025-10-04 06:02:30.895356836 +0000 UTC m=+0.084739497 container health_status 3ee0eebfc7b474d477280caa6942ba21e343ec8e8b52d1bbd635f6467d5679f8 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Oct  4 02:02:30 np0005470441 podman[238716]: 2025-10-04 06:02:30.909325593 +0000 UTC m=+0.095965536 container health_status 46a37d2b110b3311dbe517d1a99274c9adfd9861792e2da8ae59310438f33112 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.938 2 WARNING nova.virt.libvirt.driver [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.938 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.41732025146484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.939 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  4 02:02:30 np0005470441 nova_compute[192626]: 2025-10-04 06:02:30.939 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.002 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.003 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.018 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing inventories for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.038 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating ProviderTree inventory for provider 4baba3a8-b392-49ca-9421-92d7b50a939b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.038 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Updating inventory in ProviderTree for provider 4baba3a8-b392-49ca-9421-92d7b50a939b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.055 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing aggregate associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.075 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Refreshing trait associations for resource provider 4baba3a8-b392-49ca-9421-92d7b50a939b, traits: COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_SATA,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_STORAGE_BUS_USB _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.094 2 DEBUG nova.compute.provider_tree [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed in ProviderTree for provider: 4baba3a8-b392-49ca-9421-92d7b50a939b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.106 2 DEBUG nova.scheduler.client.report [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Inventory has not changed for provider 4baba3a8-b392-49ca-9421-92d7b50a939b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.129 2 DEBUG nova.compute.resource_tracker [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  4 02:02:31 np0005470441 nova_compute[192626]: 2025-10-04 06:02:31.129 2 DEBUG oslo_concurrency.lockutils [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  4 02:02:32 np0005470441 podman[238760]: 2025-10-04 06:02:32.347958849 +0000 UTC m=+0.095340798 container health_status a43dac997c3e8cadb19bbae2d19483de9ce6156bca93054d19b080216c4a3f26 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  4 02:02:32 np0005470441 podman[238759]: 2025-10-04 06:02:32.360000921 +0000 UTC m=+0.102943984 container health_status 9bf61933624e50b9d49b6765694a58a5d4a73a5631f1e0e670acb306005f41aa (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid)
Oct  4 02:02:33 np0005470441 nova_compute[192626]: 2025-10-04 06:02:33.129 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:02:33 np0005470441 nova_compute[192626]: 2025-10-04 06:02:33.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:33 np0005470441 nova_compute[192626]: 2025-10-04 06:02:33.717 2 DEBUG oslo_service.periodic_task [None req-789aee13-8def-4d56-b696-f2e78055b729 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  4 02:02:34 np0005470441 nova_compute[192626]: 2025-10-04 06:02:34.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:38 np0005470441 nova_compute[192626]: 2025-10-04 06:02:38.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:39 np0005470441 nova_compute[192626]: 2025-10-04 06:02:39.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:40 np0005470441 podman[238798]: 2025-10-04 06:02:40.322155158 +0000 UTC m=+0.065330536 container health_status 0720f6fcb7c1ac3ad1e7800188478f26ff2cc91040be3c27c6df75776c42bbd6 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_id=edpm, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  4 02:02:43 np0005470441 nova_compute[192626]: 2025-10-04 06:02:43.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:44 np0005470441 nova_compute[192626]: 2025-10-04 06:02:44.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:47 np0005470441 podman[238819]: 2025-10-04 06:02:47.318894386 +0000 UTC m=+0.063428901 container health_status 69802beeae07c5662520d446f68805bb3a3114a4c5b82e58906a373e9e77ea7e (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  4 02:02:48 np0005470441 nova_compute[192626]: 2025-10-04 06:02:48.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:48 np0005470441 systemd-logind[796]: New session 30 of user zuul.
Oct  4 02:02:48 np0005470441 systemd[1]: Started Session 30 of User zuul.
Oct  4 02:02:48 np0005470441 podman[238846]: 2025-10-04 06:02:48.453960733 +0000 UTC m=+0.090019277 container health_status 60ad72c42a5894f0501510b18eae46fce247782ea61a14091143f139c872cce3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true)
Oct  4 02:02:49 np0005470441 nova_compute[192626]: 2025-10-04 06:02:49.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:53 np0005470441 ovs-vsctl[239037]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  4 02:02:53 np0005470441 nova_compute[192626]: 2025-10-04 06:02:53.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:53 np0005470441 podman[239044]: 2025-10-04 06:02:53.385181314 +0000 UTC m=+0.129165848 container health_status 9d0bff7caee03e043238bfb13cb1cddc71ac1a5916dfd34d895b5c8250519d15 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=88dc57612f447daadb492dcf3ad854ac, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  4 02:02:53 np0005470441 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 238890 (sos)
Oct  4 02:02:53 np0005470441 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  4 02:02:53 np0005470441 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  4 02:02:54 np0005470441 virtqemud[192168]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  4 02:02:54 np0005470441 virtqemud[192168]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  4 02:02:54 np0005470441 virtqemud[192168]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  4 02:02:54 np0005470441 nova_compute[192626]: 2025-10-04 06:02:54.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:55 np0005470441 kernel: block sr0: the capability attribute has been deprecated.
Oct  4 02:02:57 np0005470441 systemd[1]: Starting Hostname Service...
Oct  4 02:02:57 np0005470441 systemd[1]: Started Hostname Service.
Oct  4 02:02:58 np0005470441 nova_compute[192626]: 2025-10-04 06:02:58.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  4 02:02:59 np0005470441 nova_compute[192626]: 2025-10-04 06:02:59.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
